So we've agreed that the brain is a complex system that somehow manages to work as a control system for the body. It receives info from the body (input), info from itself (state), and produces info for the body (output).
For now, however, let's forget that it's a control system for the body and let's just focus on the fact that it's a complex system.
So what's a system?
Imagine a set of pieces that interact with each other: a clock? the brain? a beehive? a city? They're all systems. Some more complex, some less. This explanation could become way more complex but there's no need (pun intended).
A complex system is also a system... only it has extra characteristics: 1) it's made up of an unusually large number of pieces, 2) the pieces are very similar to each other, and 3) the pieces have relatively many relationships among them.
Well-known complex systems include the brain, cities, the weather, the economy, etc.
There's also an artificially made complex system called "Conway's game of life". We'll use it as our working example so that we can build a generic definition of what a complex system is.
So here's the explanation: Conway's game of life is an infinite grid of cells. Each cell can only have two possible states: dead or alive. Every iteration, each cell calculates its new state depending on its own state and the states of its 8 neighbors. How does a cell know whether it lives or dies?
Here's a table that explains how EACH cell calculates its new state (there's also a small code sketch right after it, if you prefer reading rules as code).
Current state | Number of alive neighbors | New State |
---|---|---|
Dead | 0 | Dead |
Dead | 1 | Dead |
Dead | 2 | Dead |
Dead | 3 | Alive |
Dead | 4 | Dead |
Dead | 5 | Dead |
Dead | 6 | Dead |
Dead | 7 | Dead |
Dead | 8 | Dead |
Alive | 0 | Dead |
Alive | 1 | Dead |
Alive | 2 | Alive |
Alive | 3 | Alive |
Alive | 4 | Dead |
Alive | 5 | Dead |
Alive | 6 | Dead |
Alive | 7 | Dead |
Alive | 8 | Dead |
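In case code reads better than a table for you, here's a minimal sketch of those same rules in plain Python. This is just my own illustration of the rule, not anything official: the grid is represented as the set of coordinates of alive cells, and each iteration every cell applies the table above.

```python
# A minimal sketch of the table above (plain Python, no libraries).
# The grid is the set of (x, y) coordinates of alive cells; everything else is dead.

def next_state(alive, alive_neighbors):
    """The whole table in two lines: a dead cell with exactly 3 alive neighbors
    becomes alive, an alive cell with 2 or 3 alive neighbors stays alive,
    everything else ends up dead."""
    if alive:
        return alive_neighbors in (2, 3)
    return alive_neighbors == 3

def neighbors(cell):
    x, y = cell
    return [(x + dx, y + dy)
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def step(alive_cells):
    """One iteration of the whole system: every relevant cell computes its new
    state from the same snapshot of the grid, simultaneously."""
    candidates = set(alive_cells)
    for cell in alive_cells:
        candidates.update(neighbors(cell))
    return {
        cell for cell in candidates
        if next_state(cell in alive_cells,
                      sum(n in alive_cells for n in neighbors(cell)))
    }

# A "blinker": three cells in a row that flip between horizontal and vertical forever.
grid = {(0, 1), (1, 1), (2, 1)}
for _ in range(4):
    print(sorted(grid))
    grid = step(grid)
```

Run it and you'll see the three-cell "blinker" alternate between a horizontal and a vertical line forever, which is exactly the kind of already-stable pattern we'll care about later.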
Wanna see it working?
This image shows how those cells end up working together in their grid. It's worth stressing that the behaviors you see in the image are not programmed anywhere; they just emerge. The only thing each cell knows how to do is calculate whether it's alive or dead in the next iteration of the system.
Are we on the same page? Ok... If not, go tinker with it: draw something and see how the system evolves.
Now that we are on the same page, let's make a generic definition of a complex system, using the game of life as an example. We can then define every element/cell similarly to what we did before with the brain:
This definition works for the game of life, but what if we change it and make each cell depend on more neighbors, not only the 8 connected ones but also the neighbors of the neighbors? What if a cell calculates its state depending on the average number of deaths in the whole system? What if cells can move? You get my point, right? The neighborhood is not enough; to generalize, each element needs the state of the whole system, and the state of each element needs to be as rich as possible.
Now it should look like this:
Let's explain it. I ran out of symbols to express the complexity of it (pun intended), so I'll explain it with words. The square and the asterisk mean that there are a lot of "elems". Now, the system receives the state of ALL the elements and returns its own state (which is the aggregation of the states of each element). An element takes the state of the system and its own state, calculates its next state, and passes it back to itself and to the system.
HOWEVER, the system circle in that image is only there to make the interdependence between every element explicit; such a system doesn't exist independently. The elements ARE the system. Get it? It should look more like this:
That, however, is so fucking messy. And that's with only 5 elements; imagine a million, or, like in the brain, around 100 billion. So I prefer my previous definition: System referring to them all, and Elem referring to each piece of the system individually.
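To make that generic definition concrete, here's a small Python sketch of it. The names Element and System, and the toy "move toward the average" rule, are mine, purely for illustration: the point is just that each element computes its next state from its own state plus the state of the whole system, and that the system's state is nothing but the aggregation of its elements' states.

```python
# A sketch of the generic definition: every element updates from (own state, system state).

class Element:
    def __init__(self, state, rule):
        self.state = state
        self.rule = rule          # rule(own_state, system_state) -> new_state

    def compute(self, system_state):
        return self.rule(self.state, system_state)

class System:
    def __init__(self, elements):
        self.elements = elements

    @property
    def state(self):
        # The system has no state of its own: it's just the aggregation
        # of the states of its elements.
        return [e.state for e in self.elements]

    def step(self):
        # Every element sees the same snapshot of the system and updates at once.
        snapshot = self.state
        new_states = [e.compute(snapshot) for e in self.elements]
        for e, s in zip(self.elements, new_states):
            e.state = s

# Toy rule for illustration: each element drifts toward the average state of the system.
average_rule = lambda own, system: (own + sum(system) / len(system)) / 2

system = System([Element(s, average_rule) for s in (0.0, 1.0, 4.0, 9.0)])
for _ in range(5):
    print(system.state)
    system.step()
```

Notice that System holds no state of its own, which matches the point above: the elements ARE the system, and the class is only there to make their interdependence explicit. (Incidentally, with this toy rule all the elements drift toward the same value, a first hint of the convergence discussed next.)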
I don't know if you noticed something that the complex systems I talked about have in common, but let's make it explicit: they converge to stability. Whaaaaat?
Yes, indeed. Complex systems implicitly tend to find states that don't change very much. In the game of life, for example, that means every cell tends to end up either always dead or always alive: alive with two or three alive neighbors, or dead while avoiding having exactly three alive neighbors.
It doesn't mean they will achieve it... nor does it mean it's part of their programmed behavior. It just happens. That convergence to stability occurs in most of these systems, and it will be important to keep in mind from now on, because we'll eventually take advantage of that naturally occurring convergence.
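Here's one possible way to put a number on that, sketched again with the game of life. The measure itself, cells changed per iteration, is my choice for illustration, not the only one; a system that is converging to stability is one where this number tends to drop over time.

```python
# A sketch of one way to quantify "convergence to stability" in the Game of Life:
# count how many cells change state on each iteration. The helper functions repeat
# the earlier sketch so this snippet runs on its own.

def neighbors(cell):
    x, y = cell
    return [(x + dx, y + dy)
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def step(alive):
    # A cell is alive next iteration if it has exactly 3 alive neighbors,
    # or if it's currently alive and has exactly 2 alive neighbors.
    candidates = set(alive)
    for c in alive:
        candidates.update(neighbors(c))
    new_alive = set()
    for c in candidates:
        n = sum(nb in alive for nb in neighbors(c))
        if n == 3 or (n == 2 and c in alive):
            new_alive.add(c)
    return new_alive

# The R-pentomino: a tiny seed famous for churning for a long time before it settles.
grid = {(1, 0), (2, 0), (0, 1), (1, 1), (1, 2)}
for i in range(20):
    new_grid = step(grid)
    changed = len(grid.symmetric_difference(new_grid))   # cells that flipped state
    print(f"iteration {i:2d}: {changed} cells changed state")
    grid = new_grid
```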
For now, I understand if you don't believe me, so the next posts will be about examples of complex systems and the measurements we can make on them to see how they stabilize.