This is a computer-based concept that explains how various functions programmed into computers are combined, and recombined, to achieve new capabilities. The idea of combining functions on a computer is old and very well established. Not only have programmers been strongly encouraged to recombine functions since the beginning of commercial programming, but one mark of a programmer's success is how well his or her work has been crafted for re-use and how often that re-use has paid off. By way of preface, this explanation starts with the most elemental functionality that programmers themselves use regularly in their work as creators of low-level software and extends to notions of recombinant systems. To understand recombination, it is also necessary to understand the notion of a protocol as it applies to computers.
The following primitive example of a re-usable program is intended to help non-programmers understand a bit about how the idea of recombinant functionality is operationalized in programming environments at a low level. It's a bit more story and metaphor than actual history, but the concepts are way more important than historical accuracy.
Consider the idea of a function that gets the time from some kind of clock built into a computer system. Lots of computer programs need the current clock time for myriad reasons. Well, in the early days, this was not so trivial a matter as it has become today. Someone had to put a clock in a computer and then write some software to use it; that is, transfer some kind of representation of the current time to become data usable by a computer program. Of course, it would be silly in a business programming environment for every programmer to be required to invent this particular function considering how useful it would be for everyone, so computer programming shops would assign someone to write a program that provides the time, or buy a computer along with software that performs the function.
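To make the idea concrete, here is a minimal sketch of what such a shared time function might look like in a modern shell. The function name now and its output format are illustrative choices, not historical ones; the point is that it is written once and called everywhere:

```shell
# A hypothetical shared "get the time" function that a programming shop
# might distribute once so that nobody has to reinvent it.
now() {
    # Ask the system clock for the current time as hours:minutes:seconds.
    date +%H:%M:%S
}

# Any script that loads this definition can simply call it:
now
```

Every program that needs the time calls now rather than dealing with the clock itself, which is exactly the kind of re-use the story above describes.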
Now consider the problems that arise when this kind of reusability is undertaken. There is first of all the assumption that the one function everyone uses provides the data from the clock in a form usable for all possible purposes. Then there is the problem of arranging for everyone to obtain a copy of this software and then to figure out how to use it reliably. These may not sound like issues today, when we're all so accustomed to sharing all kinds of data and capabilities over the internet without the slightest consideration of these matters, but what we take for granted today is the product of millions and millions of hours of problem solving by programmers and system architects to make such things easy to do.
Back to our problems. One way, in the old days, might have been for everyone to have a printed copy of the code that performs the function, and each of them just types in a copy wherever it's needed. That would work fine, but what if someone finds a tricky little error in the code or thinks up a cool new feature? A new version of the code would need to be issued to everyone, and they would all have to go back to the programs they wrote and fix them.
Unix Commandline Combinations
Non-geeks, this is for you. Please try to play along.
The commandline is what froze out personal computer users in the old days. Instead of the marvelous graphical user interface (GUI), there was a blank screen with a blinking cursor, like a word processor, and it was up to you to know what to type. If you didn't know, there was apparently no way to find out, unlike the GUI, which lets you discover what is available because all the possibilities are presented as desktop icons, or menu items, or something. The problem was not that using a commandline-based system was so difficult in itself, but that learning about the possibilities felt tangential to the real purpose of the computer, whereas simply clicking on an icon with a mouse was memorable. In truth, once mastered, commandline systems are straightforward and efficient, and offer combinatorial opportunities that GUIs classically lack - which is where this discussion is going. By the way, Neal Stephenson wrote a pretty interesting book about the commandline, In the Beginning... Was the Command Line.
Unix commands are little pieces of functionality that take part in an operating environment, called the commandline shell, that promotes recombinant behaviour by allowing the output created by one command to be fed into the input of a subsequent command. In this way, the creators of commands do not have to write incredibly elaborate do-everything commands, but can augment available functionality incrementally, counting on users to combine commands for their special purposes. A very simple example:
The ls command lists the files in a folder. The grep command searches for occurrences of strings in a file or a stream of characters. If you connect the ls command and the grep command together on the same commandline, you can find out whether a particular file is in a given folder without having to page through a zillion-screen list when the folder contains an enormous number of files. Make sense?
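The connection is made with the pipe character (|), which feeds the output of one command into the input of the next. Here is a small sketch; the folder and filenames are made up for illustration:

```shell
# Create a scratch folder holding a few files, two of whose names
# contain the string "report".
dir=$(mktemp -d)
touch "$dir/report-2024.txt" "$dir/notes.txt" "$dir/report-draft.txt"

# Pipe the output of ls into grep: ls lists every file in the folder,
# and grep keeps only the lines containing "report".
ls "$dir" | grep report
```

Neither ls nor grep knows anything about the other. The shell's pipe is the convention - the protocol - that lets the output of one become the input of the other, and it works the same way for thousands of other commands.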
So far, I've discussed recombination in software, but consider that it exists everywhere. Putting an old Chevy engine in a boat is easy to think about. Automobile engines are designed for use in cars, but you'll notice that manufacturers don't build the engines into cars; they build them separately and install them. This way, they can put the same engine into various cars and various engines into similar cars. This isn't just modularity, but a species of recombination. Even though auto engines can't be replicated like chunks of software and glibly reused by typing a few characters, logically, the difference is only one of convenience.
Then there are the recombinations of text that everyone uses so easily. Boilerplate contracts are composed of pieces that people assemble into new wholes for different purposes. The idea originated long before the advent of word processors, to save everyone the trouble of writing special new contracts for new purposes. With word processors, it's convenient as hell, but still, logically, the same act of recombination.
You could say that words of a language are part of a recombinant system, the language itself. We can even create new words to simplify the combinations into useful packets of ideas that can then be recombined further for additional purposes.
Finally, think about how people's skill sets are combined and re-used. You can form a new organization by drawing together people with all kinds of traditional skills and interests, yet the organization can be applied to achieving goals that no one has previously considered. Of course, people are not inert objects but subjective entities with independent agency, so the joining they can offer is much looser than the joining of mere objects. But what if objects were created to act as fairly independent agents? Have a look at Loose Coupling.
Recombinant functionality is an important virtue of systems, especially computer systems. Its value comes from the ability to create novel functionality through combination. It works much better with parts that are designed to be recombined with other parts. It works best when the act of recombination has been simplified to such a degree that combinations can be created and discarded as a simple aspect of use, or kept forever to serve as new parts.