
Conversation between Aaron Sloman, Anthony Leggett, Chris Fields, and Michael Levin.

Aaron Sloman, Anthony Leggett, Chris Fields, and Michael Levin discuss qualitative Austenian information, early biological information processing, metamorphosis and information control, creative reinterpretation of information, and distributed biological information encodings in living systems.

Show Notes

This is a ~1 hour talk with Aaron Sloman (https://en.wikipedia.org/wiki/Aaron_Sloman), Anthony Leggett (https://en.wikipedia.org/wiki/Anthony_James_Leggett), and Chris Fields (https://chrisfieldsresearch.com/). The links to which Aaron refers are:

https://cogaffarchive.org/misc/whatlife.html

https://cogaffarchive.org/evol-devol.html

CHAPTERS:

(00:00) Qualitative Austenian Information

(17:07) Early Biological Information Processing

(29:37) Metamorphosis And Information Control

(47:31) Creative Reinterpretation Of Information

(58:05) Distributed Biological Information Encodings

PRODUCED BY:

https://aipodcast.ing

SOCIAL LINKS:

Podcast Website: https://thoughtforms-life.aipodcast.ing

YouTube: https://www.youtube.com/channel/UC3pVafx6EZqXVI2V_Efu2uw

Apple Podcasts: https://podcasts.apple.com/us/podcast/thoughtforms-life/id1805908099

Spotify: https://open.spotify.com/show/7JCmtoeH53neYyZeOZ6ym5

Twitter: https://x.com/drmichaellevin

Blog: https://thoughtforms.life

The Levin Lab: https://drmichaellevin.org


Transcript

This transcript is automatically generated; we strive for accuracy, but errors in wording or speaker identification may occur. Please verify key details when needed.

[00:00] Aaron Sloman: Something seems to be wrong with Chris's video, but if he can hear us, it doesn't matter.

[00:06] Michael Levin: I can see him.

[00:07] Aaron Sloman: Oh, well, now his video is not working. Well, I'm going to try to cover a variety of different points. The most important, I think, is a very recent idea, which is summarized in the document called whatlife.html. I don't yet know that I've got the best way of explaining it, but I'll try. That extends in an unobvious way things that I've been working on for several years, which are in the evol-devol.html file, which I've been working on and adding things to for a very long time. It originally had an entirely different name. Then I decided to change the name because of various things that happened. It has a huge number of links to other things in it. In principle, it could be very useful, but in practice, it's probably very hard for anybody to navigate. One of the things I'm trying to do now in that document is add hints for people who want to give suggestions about where to look and what to look at. Even so, it'll be a huge thing that's very hard to make use of and digest. What I'll focus on mainly now is the most recent ideas, which seem to me to be important, but I may be hallucinating things and you will correct me and say, no, you've got something wrong there or you may agree. It came out of thinking about how I've been using the notion of information and how other people use it, but mainly about how I was using it. I've written things about this in the past. For example, I've written criticisms of people who cite the Shannon notion of information as something measurable in terms of bit patterns. Now, when Shannon did that, he was doing a particular job for his employer. I forget the name, an electronics company in the USA. He was developing some technology for them. That technology made use of the patterns to produce a measure which he was pointing out could have some technological uses. Unfortunately, very many people thought he was providing at last a precise, mathematically clear definition of what information was. I don't think he thought that at all. In my comments on that in the past, I've said I think that he was well aware that his concept that he was defining there was very different from the concept of information used in the novels of Jane Austen over a century earlier, where she kept talking about what her friends did when they got some information. I'm saying that Jane Austen's notion of information, which is something like the notion that I'm using, or very close to the notion that I'm wanting to talk about, has all sorts of unobvious and subtle features, which I'm trying to describe and explain.
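
A minimal illustrative sketch, not from the conversation, of the Shannon-style measure Aaron is contrasting with the qualitative notion: it assigns a number of bits to a message purely from symbol frequencies and is silent about what the message is about. The function name and example strings below are our own.

```python
import math
from collections import Counter

def shannon_entropy_bits_per_symbol(message: str) -> float:
    """Shannon's measure: average bits per symbol, computed only from symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# The number depends only on the statistics of the characters, not on what the
# sentence is about, which is the point of the contrast with "Austen information".
print(shannon_entropy_bits_per_symbol("mr darcy has arrived"))   # roughly 3.3 bits per symbol
print(shannon_entropy_bits_per_symbol("the carriage is ready"))  # a similar value, different meaning
```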

[04:22] Aaron Sloman: But I'm not sure how well I'll succeed, partly because of my broken brain, which gets in the way. But I'll try anyway. One of the main features of this notion of information is that it's definitely not numerical. It's something more qualitative. For instance, if I'm talking about the information that I'm trying to communicate to you, I don't think I'm talking about something that can be measured in any way at all. It's something more abstract than that. That doesn't stop it being useful. This is the key point that I'm claiming: biological processes make use of a notion of information that is not a measure of anything, and biological processes make use of it in deep and powerful ways. For instance, when deciding what to do next in some complex situation, they will use information about what the current situation is, and they'll also use information about options available, and they'll use information about the differences and similarities between those options in making a choice. None of that involves using anything remotely like Shannon's bit patterns. I'm also saying that what they're using is very hard to define. Attempts to define it explicitly are likely to go around in circles because you're trying to make use of a definition by saying things that will be equally hard to define, which are the only things available to explain what that notion of information is. I don't know how much sense that makes. If anyone has any comments so far, I'm willing to pause and listen. I should just carry on then. Why is this notion so important? Things that go on during processes of gene expression, processes of evolution, processes of development involve making selections between options that are currently available. There are things that are currently accessible and there are things that can be done to the things that are currently accessible. During development, an organism has to work out which things that it can access it should access. Secondly, it should work out what it should do to them to change them in order to continue its growth or developing new capabilities. What can it actually access when it's accessing that information and how does accessing it work? I'm probably going to end up being inarticulate because this is so abstract and difficult to explain, but I will try anyway. What I'm suggesting is that there is something that can be attended to and it can be distinguished from other things that can be attended to. In doing that, one is not doing any of the kinds of things that are normally thought of as computations or manipulation of structures that are well-defined, understandable structures.

[08:36] Aaron Sloman: One is directly using and choosing and doing things with this abstract stuff, which I call information, but which we don't often talk about explicitly, although we often do it unthinkingly because it is so useful. As I say, Jane Austen was doing it when she was writing her novels, and she didn't have any definition in mind of what she was talking about. She didn't need to have a definition in mind because the human brain is able to make use of this concept without having any definition of it. And how exactly it does that seems to me to be a deep problem, which is quite difficult to answer. And I'm not claiming that I can answer it. I'm pointing in the direction of features of this thing, which I'm hoping will eventually enable cleverer people, maybe the other people in this room, to say things and do things with this notion that I've been trying to do and so far finding quite difficult. Thinking about that with this fairly broken brain is quite tricky. I currently have quite a lot of information available to me about what's in my room. I can see that you can access quite a lot of information as well. Some of the things you have access to are, for example, that there's a bookcase behind me. I can see there's a bookcase behind Mike. There are other things behind Chris. What's in that bookcase? You can't really tell. But you can probably tell what kinds of things I could do that would make it easier for you to see things that are in the bookcase. For instance, you can work out that if I were to move myself to the right, my right, your left, depending on what this system is doing, you would see more of what is behind me. If I just move my head, that shows you a lot less than if I moved my whole chair. But you can still see what I'm talking about, that there are things that now become visible that were not previously visible. So you have information about what information is there and information about what I can do in order to make more of that information visible, and things that you might be able to do. For instance, you might be able to change the view you're controlling by enlarging portions or changing the magnification. I can't at the moment think of good examples of what you might be able to do.

[12:51] Aaron Sloman: So that raises all sorts of questions: first of all, how does this concept of information come about in the evolution of our species? Was it there in the very earliest organisms or not? What were the very earliest organisms? My current hunch about that, which you may have heard me talk about in the past, is that the earliest organisms from which we are descended are actually ancestors of our synapses. Our synapses are things that contain a lot of biochemical compounds interacting in various ways. Those interactions, I am suggesting, involve doing a lot of processing of information in ways that are not currently understood. That raises the question: what can we do in order to understand them? Maybe if we try to look at how they evolved from much simpler organisms, and what those simpler organisms did to lead to current synapses, we might be able to get some clues. My suggestion is that originally, at some point in the history of the planet, there were single-celled entities that were somehow able to interact with their environment in rather limited ways. How can a single-celled organism interact with its environment? If it's embedded in chemical soup, what might it do? It might, for example, try to let some of that chemical soup into itself. Or it might let some of its own contents out into that chemical soup to interact with the things that are already in that chemical soup. How would it choose between such options? How could it control them in order to get some future benefit? For instance, increasing the things it's able to do. I'm suggesting that at this very early stage in biological evolution, these single-celled, very simple organisms were able to let some of their contents interact with other things that were in the environment by opening channels in their skin, and letting stuff out and controlling or interacting with what happened next. Before I say more about that, does anyone have any comment on what I've said so far? Does it make any sense or are there any questions or objections about those points? Chris is a person who's heard me talk about this for the longest time, going back to when we met at Sussex University, which I had completely forgotten until he recently reminded me. I don't think at that time I was thinking about these things, although I was moving in this direction. He's not saying anything, so I presume nothing has come to his mind about this.

[17:07] Michael Levin: Chris, do you want to say anything about the way that you and we have been thinking about the very early steps of information processing and the active inference? Is there anything you want to say?

[17:23] Chris Fields: I can say a few things and then let Aaron continue. I think it's very well understood that all known organisms, including all known bacteria and archaea, as well as single-celled eukaryotes, but certainly the simpler organisms as well, interact with their environments in very rich ways. In some cases, those interactions are reasonably well understood in terms of biochemistry. For example, in microbial chemotaxis, there are very well-understood pathways in which there's a good map of what molecules serve as switches, what molecules serve as memories, how long those memories last, how these decisions are actually made, what molecules import information from the environment, and exactly what kinds of information they're able to import. There is a lot of very good biology there, some of which is known at a good enough level of detail to do modeling. With that kind of confirmatory statement, why don't you go on?
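
To illustrate the kind of modeling Chris refers to, here is a deliberately toy, coarse-grained run-and-tumble sketch in Python. It is not a model of any specific pathway discussed here; the variable names, parameters, and gradient are illustrative assumptions. The activity variable plays the role of the molecular switch, and the slowly adapting set point plays the role of the molecular memory.

```python
import math
import random

def attractant(x: float) -> float:
    """Hypothetical attractant concentration, increasing toward positive x."""
    return math.exp(0.05 * x)

def run_and_tumble(steps: int = 5000, dt: float = 0.1, adapt_rate: float = 0.1) -> float:
    x, direction, memory = 0.0, 1, 0.0
    for _ in range(steps):
        signal = math.log(attractant(x))
        # "Switch": activity is low while conditions are better than the adapted
        # set point (i.e. while climbing the gradient), high when they are worse.
        activity = 1.0 / (1.0 + math.exp(signal - memory))
        # "Memory": the set point slowly relaxes toward current conditions, so the
        # cell is effectively comparing the present with the recent past.
        memory += adapt_rate * (signal - memory) * dt
        # High activity makes a tumble (random reorientation) more likely.
        if random.random() < activity * dt:
            direction = random.choice([-1, 1])
        x += direction * dt
    return x

# Averaged over many simulated cells, the final position drifts up the gradient.
print(sum(run_and_tumble() for _ in range(20)) / 20)
```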

[18:58] Aaron Sloman: I'm going to suggest that some of what has to be said about this was discussed by a guy called Tibor Ganti. I don't know if any of you have heard of him. T-I-B-O-R and G-A-N-T-I. He, a long time ago, tried to work out what the minimal conditions are for reproduction that involves cells combining information by merging and growing things that come out of them. But one of the kinds of things he was saying is that the processes involve interacting in ways that involve two complex things sharing some of their structure, but not all of it. Because if they were sharing everything, they would then have to be the same thing. And in order to decide what is and isn't shared, they have to have some complicated way of communicating with one another. And this goes on during sexual reproduction, he claims, when cells that are starting to make copies of themselves start producing contents for the copies, which are based on what's in the original cells. And in sexual reproduction, that becomes a very complicated process because you have to have stuff from the female being combined in a complex way with stuff from the male. And it's a non-trivial operation, putting those things together in a way that's going to work. But even just combining them into a single structure, the new form of DNA, is a complex, sequential process, about which you have probably read things in the past and may know more than I do.

[21:52] Aaron Sloman: He was trying, by describing these complications, to give indications of how some of the interactions might be controlled, in ways that would limit the interactions and channel them in fruitful directions. And those had to evolve and change over time as things became more complicated. But even so, that involved a lot of biochemistry. I have to try to remember what I'm going to say next about that. I'm hoping I can say something about this concept of information that has these new features that are difficult to explain, but are important. I may end up becoming inarticulate because of the complexity of the subject matter. I wonder what happens if I try to get the "What Life" document on screen. I thought I had planned a better strategy than I'm now able to work through.

[24:50] Michael Levin: It's fine.

[24:51] Aaron Sloman: Let me think.

[24:53] Michael Levin: I think one thing we could talk about, I remember a while back, we had a conversation about your interests around instincts and certain animals that are born knowing how to do things. And you had some thoughts around geometry and things like that. I wonder if these ideas around information are meant to connect to that, or is that something else?

[25:17] Aaron Sloman: Yes, because the very simple single-celled organisms that must have evolved at an early stage would need to combine with other organisms in order to form more complex structures. In order to combine with them, they had to share information by producing gaps in themselves and allowing chemicals to flow into themselves from the other organisms. They were also allowing chemicals to flow from themselves into the other organisms in order to share this information. Controlling those processes would not be at all simple. What I'm suggesting is that there were ways of doing that control that involve features of the physical universe that have not previously been understood, described, or noticed, which enable that sort of communication and mutual interaction to happen, allowing things to share their uses of information. Now I need to be able to say something more about what those features are and how they work. The question is whether I can say anything useful about those features and how they work. The main points were in the whatlife document. What I've been saying in that is that the concept of information... I wonder if trying to get it on the screen would help. I think it probably wouldn't. The content of information involves... maybe I will try. So if I click on that, it involves... I'll talk through it anyway.

[28:03] Michael Levin: If you want me to share the doc, I can try to figure it out.

[28:08] Aaron Sloman: No, I think not. The main thing is that the things that share information that produce living organisms and develop new ones have to be able to share structures. They have to be able to communicate. In order to do that, I am suggesting that they do something that is currently very hard to explain and goes beyond what humans have built. What happens now if I go to that? I'm going around in circles.

[29:28] Michael Levin: There was a provisional title for today that was something about insect metamorphosis and challenges to current theoretical physics. Maybe it was something around there.

[29:37] Aaron Sloman: One of the things I started thinking about very recently is what happens during insect metamorphosis. What you already know is that an insect can form a cocoon. When it forms a cocoon, interesting and complex things can happen. The cocoon will contain the insect in its original state. But while the insect is in that state, it slowly starts disintegrating parts of itself and using the decomposed structures, the molecules, reorganizing them in a way that begins to assemble a new organism. This is the very dramatic point I was going to make: the original organism might have been a crawling insect, an animal that crawls along the ground, feeding on stuff that it finds there. Eventually, when it is ready to transform itself, it will park itself somewhere where it grows this outer covering, this shell, in which it decomposes some parts of itself. For instance, it decomposes the old apparatus for crawling around and other things that it was using to feed in the environment. It uses the matter to build entirely new structures. It can build wings. It can build a proboscis, which it will later use when it flies to flowering plants, and it parks on the plant and it puts its proboscis in and it sucks out nectar. In doing that, it can transfer pollen between plants. But in order to produce those new structures, it has to decompose very complex structures that are already inside this organism that has been growing while feeding on all the plant matter and other stuff that it was feeding on while crawling around. While decomposing that structure, it has to somehow disconnect the molecules from what they were previously connected to in the assembled original structure and reconnect them in new ways. How that happens must use mechanisms of control, about which very little is currently understood. I don't think anybody at the moment really understands how such a system can control the decomposition inside the chrysalis or the pupa, which is being disassembled, and how it can control the reassembly of those molecules to form this new structure, this new organism, which, when it eventually emerges from the chrysalis (it partly chews its way out), can do something that it has never done before. It comes out of this structure and can then fly; it has never flown before. It has crawled around on the ground or on plants, perhaps chewing stuff, but has never been able to fly. Now it not only has wings, but it knows how to use them. So that process of reorganization and recomposition has had to not only produce those new physical structures, but also new brain mechanisms used to control their operation, which is entirely non-trivial, because coming out of a structure in one place and flying to plants which can be some way away and landing on them in such a way that it can then drop its proboscis down into the plant to pick up nectar is a complex task. And at the same time, inadvertently, without intending to, it pollinates the plant.

[34:06] Aaron Sloman: When it's doing all that, it's doing entirely different things from anything it's done in its former life. So somehow the information about what to do in that process must have been there inside the organism while it was going around chewing stuff and feeding on material stuff. It must have been there while it was inside the pupa decomposing itself, but it wasn't being used. Somehow that process of decomposition and recomposition inside the pupa makes that information, that behavioral control information, available to the organism. I suspect nobody at the moment knows how that is done, or if anyone does, I'd like to know how. I don't know how easy it'll be to find out how it's done. My current conjecture is that there's something about the notion of information in biology that is new, that we haven't understood, that has to be used in order to produce an explanation of what's going on there. Clearly, there is information being used, being transformed, being reassembled, being shared, and so on. The question is, what is that information? How did it evolve? Where did it come from? What kinds of things can we say about it? I'm suggesting that's going to be a very difficult problem. I suspect it may be an important fact that this notion of information is not definable in any way that is currently used to measure or define information. Bit patterns won't work for it, for example. In fact, I think a lot of people were confused by bit patterns. Bit patterns were developed by Shannon. At last, people said, "Now we know what information is."

[38:34] Aaron Sloman: He came up with this measure of information in terms of bit patterns. But I think he understood very well that what he was measuring was something that was useful in the context of the technology that his employer needed, but was not the old concept of information, which I have called Austen information, because Jane Austen made use of it in a lot of her novels, where she talked about information she was sharing with her relatives and her friends and so on when she communicated with them. What I'm now saying is that the information she was using, I now think, has a very long history. It's a very powerful notion of information, and it cannot be defined explicitly. It's something that we can point to and we can say things about how it works, but we cannot come up with the definitions and say, this is what that sort of notion of information is. The question then is, what sort of things can we say? How can we say them and how can we begin to make it easier to communicate fruitfully about this notion of information and perhaps solve new problems that cleverer people than I am will perhaps make progress on, which I can't make progress on at the moment? I've already mentioned Tibor Ganti and his stuff and how he pointed out the complexities of combining information from two different sexual histories which are brought together in sexual reproduction. There is stuff in Wikipedia about the origin of language, which talks about the history of debates about the origins of languages and possibilities that sign languages came first, but I'm going to ignore that for the moment. One of the things that I've talked about in the past is that deaf children have been shown to be able to invent a new language. If you bring them together, they don't have to be taught a sign language by people who already know sign language; if you put them together in the right circumstances, deaf children will start interacting and signing and then will eventually develop quite a rich language. That actually happened once in Nicaragua; I've given a link to that in the background information. There's a BBC documentary about it. Those deaf children somehow made use of stuff that was in their genome that got them interacting in ways that eventually led them to produce a new sign language, which they developed and shared. Later experts in sign language from other parts of the country interacted with these children and were gradually, with difficulty, able to work out how this language works. That was quite a dramatic discovery, and it indicated there was something in the genome of those children that just needed the right environment, namely interaction with other such children.

[43:03] Aaron Sloman: It also suggested to me that sign languages might have evolved before spoken languages. One of the reasons why I thought that might have happened is that ancient organisms, precursors of humans, in order to be able to use spoken language, had to combine in very complicated ways mechanisms for getting food into their mouths, chewing it, swallowing it, and digesting it. They then had to combine that with mechanisms for breathing, which were used for producing spoken language, causing vibrations by breathing out through the vocal cords in order to produce the sounds that are used in spoken languages. Getting all that to work involved very complex changes in the system of production of breath and swallowing. You may know more details than I can talk about now. They do require, for example, some things to shut during the process of swallowing, shut temporarily, so that the swallowing can happen without choking. That means that you can't talk at the same time as you're swallowing, or if you try, you get into trouble. It's really very complicated. The suggestion is that that product of late human evolution came long after the development of languages which used signs, which didn't have these problems about combining these vocal and digestive and other mechanisms. Because for your hands and your eyes, you don't have to do anything with your swallowing mechanisms or your breathing mechanisms, as you do with spoken language. That conjecture, which I've proposed in the distant past, still leaves open a whole lot of questions about how that evolution happened and when it happened and what features of the genome made it possible for those sign languages to be developed in our ancestors, if I'm right about that. That would be part of the complexities of information in our history, and the ways that it can function and the kinds of things that it can do that I think need to be unraveled in greater depth than has been done so far. Are there any comments or questions on that at the moment? Was there anything else I wanted to say? The "What Life" document which I've now got up here. Is there anything I want to read out from that? I mentioned Ganti, and I mentioned the BBC documentary and the stuff about the genome. Here I am just saying that I think there's this notion of information which I think we in some sense all understand and use, but we cannot define, and that it plays important roles in all sorts of things, including reproduction of various sorts, but also in communication and thinking and reasoning. I think there was something else I wanted to say about it. See if I can find it. I think I've covered most of that and I think that there's stuff to be done and I will go on trying what I can, but I suspect it's going to need help from other people maybe with better ideas than I have who will be able to take these ideas further, develop them further and perhaps produce new publications that will report on what you've done with those ideas. I don't expect you to have immediate ideas at the moment because a lot of this will be very unfamiliar, and reacting to it requires some thought, and that will take time. If you have any thoughts about it now, I'm certainly willing to listen to them.

[47:31] Michael Levin: One thing that we've thought about in terms of the butterfly and caterpillar thing is that there's an additional phenomenon there that's interesting with respect to the information, which is that if you train the caterpillar on a specific task, to recognize a stimulus and then crawl to it and receive a reward in the form of leaves, the butterfly will remember that original information, even though, as you pointed out, the brain and the body are largely dissolved. What's interesting about that is not only the aspect of storing the information while the material medium is being completely refactored, but also the fact that the memories of the caterpillar are of no use to the butterfly because it's a completely different organism with different reward structures. It lives in a 3D world instead of crawling around the way the caterpillar does. That information needs to not only be kept intact, but it needs to be reinterpreted and remapped onto a new body architecture because you're no longer crawling towards leaves. You're now flying to find nectar. The fundamental lesson that you learn, that here's a stimulus that will get you food, is the same. We've spent a lot of time recently thinking about the way that information gets abstracted and compressed into a generative representation that then can be remapped, not only can be, but has to be remapped onto a new scenario, a new body, a new environment, new control effectors. In that aspect of it, the compression might be algorithmic, but the expansion is creative. There is no algorithmic way to do that because you don't know what the new situation is going to be. Re-inflating those memory engrams in a way that makes them applicable and salient in a new scenario is a creative aspect of this. We have a machine learning group in our lab that studies certain architectures that do this too. They have a very small node in the middle of all these other layers that forces a compression and then a re-expansion, like an autoencoder. It's not clear yet how that process happens. I think it will be a big thing for the future to understand how the information isn't just held onto, but actually remapped in creative ways. Your other question about the actual body structure is fundamentally the same problem. That is, whether you're an egg or a caterpillar that has to scrap most of the body and start again, it's similar. You're handed down certain information from the past that's genetic, cytoplasmic, and various other things. Now you have to interpret that information in a way that makes sense. You might be able to do it the same way that prior generations have done it, or you might have to get creative in the case of the novel things that people do. With our xenobots, anthrobots, different kinds of weird hybrids, and strange new creatures that have never been here before, you can't just interpret it exactly the same way that previous generations have done it. You have to do it in new ways. Biology seems to be very willing to do that. We see incredible plasticity in living materials, which are able to take the information they've gotten and not use it exactly as it has been used, but use it as affordances for new kinds of both forms and behaviors. What you've pointed out is the essential need to understand the reinterpretation aspect of the information, not just the encoding, not just the information content, but the observer on the other end trying to make sense of it and doing something useful with it.
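
As a concrete, deliberately minimal illustration of the bottleneck idea Mike describes, here is a tiny autoencoder in Python with NumPy. The data, layer sizes, and training loop are illustrative assumptions rather than the lab's actual architecture, and it shows only the compression and re-expansion; the creative remapping onto a new "body" would correspond to attaching and training a different decoder on the same frozen code, which this sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # toy "experience" vectors
X[:, 1] = X[:, 0]                    # built-in redundancy that compression can exploit

n_in, n_hidden, n_code = 8, 16, 2    # 2-unit bottleneck, much narrower than the 8-dim input
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_code))
W3 = rng.normal(scale=0.1, size=(n_code, n_hidden))
W4 = rng.normal(scale=0.1, size=(n_hidden, n_in))

def forward(X):
    h1 = np.tanh(X @ W1)
    code = np.tanh(h1 @ W2)          # compressed representation at the bottleneck
    h2 = np.tanh(code @ W3)
    return h1, code, h2, h2 @ W4     # re-expanded reconstruction

lr = 0.01
for _ in range(2000):
    h1, code, h2, out = forward(X)
    err = out - X                    # reconstruction error drives learning
    g_out = 2 * err / len(X)
    g_h2 = (g_out @ W4.T) * (1 - h2 ** 2)
    g_code = (g_h2 @ W3.T) * (1 - code ** 2)
    g_h1 = (g_code @ W2.T) * (1 - h1 ** 2)
    W4 -= lr * h2.T @ g_out
    W3 -= lr * code.T @ g_h2
    W2 -= lr * h1.T @ g_code
    W1 -= lr * X.T @ g_h1

print("final reconstruction error:", float(np.mean((forward(X)[3] - X) ** 2)))
```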

[51:31] Aaron Sloman: Now, a comment about why I've involved Tony in this: I have a suspicion that explaining in detail how those processes in the cocoon, the metamorphosis processes, work, and how some of the other things that happen in other aspects of the development of insects and other organisms work, may require changes to fundamental physics. What's going on there is that processes decompose parts of the existing organism that formed before it grew the cocoon or pupa. When it decomposes itself and moves molecules around inside that cocoon, somehow that process has to be controlled. The question is how all those molecules are moved to where they need to go to form an entirely new organism. Bits and pieces of the old organism decompose and reassemble to form this new organism, which has wings and different kinds of legs, and which has a tongue that it will later be able to put into flowers to suck nectar. Those processes of decomposition and reassembly inside that cocoon, I suspect, may challenge current theories of fundamental physics in ways that I'm not saying Tony would be able to explain or answer, but he has been thinking about ways in which current theories in fundamental physics may turn out to be inadequate and may have to be expanded. It's just possible that there is some connection between the things that I've been talking about and the things that Tony has worked on in the past, or it may be that they're very different or only very loosely related. I'm not expecting that Tony will right now be able to say this is how it works, but you may have some comment on whether you would agree that there seems to be some challenge to fundamental physics in the processes that go on inside the cocoon. I think it would be very useful in this context if one could actually produce a proof that some fundamental principle of physics, for example the second law of thermodynamics, is being violated in these processes.

[54:47] Aaron Sloman: I think some of the most important progress in physics has come from impossibility proofs. The Bell theorem on local hidden variable theories. If one could somehow isolate some feature of these processes, which you're talking about, which can be clearly shown to violate some fairly fundamental principle, that would go a long way to making progress in this area. As far as I know, there doesn't seem to be such a principle, but maybe you can prove me wrong. I think the suggestion that there is a notion of information that's not definable explicitly, but which we in some sense understand and can use and is used in some of these processes, may be related to that, but I can only suggest that without giving strong arguments. It does seem clear that something like information must be involved in processes like sexual reproduction and the metamorphogenesis processes inside the cocoon. It may turn out that there's a collection of these processes in biology that depend essentially on aspects of fundamental physics that are not yet understood. Whether this example adds anything new to what people like Tony and others have already been looking at, I don't know. It may well be that there's something about this that might give new hints as to what is required for changes in physical theory to produce explanations of how such transformations can occur. That's just hand-waving. There may not be any such results.
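
For context on the kind of impossibility result being invoked here, and not part of the conversation: the CHSH form of Bell's theorem states that any local hidden-variable theory must satisfy

$$\lvert E(a,b) + E(a,b') + E(a',b) - E(a',b') \rvert \le 2,$$

where the $E$'s are correlations between measurement settings $a, a'$ and $b, b'$, while quantum mechanics predicts, and experiments confirm, values up to $2\sqrt{2}$. A comparably sharp, experimentally checkable bound is what is being asked for in the biological case.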

[58:05] Chris Fields: If I could add one thing to the discussion. When we think about information in physics or in thermodynamics, we tend to think about a particular encoding that has some particular information, maximal information content or maximal entropy. But one thing that we do clearly know from biology is that there's enormous redundancy across different kinds of encodings. For example, the point you make about insect metamorphosis, as Mike mentioned, is quite analogous to the situation that happens in insects that don't undergo metamorphosis but develop directly from eggs. You have a single-celled embryo embedded in yolk, which is a mixture of various proteins and fats. That develops into an entire well-structured organism with all kinds of behavior and everything else. We know that the information required to do that is not completely contained in the genome. Much of the information must be contained in the cytoplasm of the egg, in the cytoskeleton of the egg, in the membrane structure of the egg, in the bioelectric capabilities of the egg. So it becomes not a question of asking how some structure encodes some amount of information, but rather a question of how many different structures can be assembled using some information from one place and more information from another place and so on. For example, the structure of a protein is not encoded in its genetic representation. It's encoded in a combination of its genetic representation and the information in the cytoplasm and the information in other proteins. There's no single locus of the information that's required to do any of these things. The different encodings use radically different languages. We know all of those things in biology.

[1:01:10] Anthony Leggett: I'm sorry, I have to rush because I have to catch a bus. I have to quit at this point. Thank you.

[1:01:18] Michael Levin: Thanks, Tony. It was good to see you.

[1:01:20] Chris Fields: Yes, thank you very much. Yeah.

[1:01:22] Michael Levin: Apologies. I'm going to have to run in about a minute as well. I've got a meeting.

[1:01:24] Aaron Sloman: On that last point, that may be related to the things I've written about that go on inside hatching eggs of vertebrates. You weren't talking about vertebrates there. You were talking about eggs of insects. Is that right?

[1:01:45] Chris Fields: Well, it applies to eggs across the board.

[1:01:48] Aaron Sloman: In the case of vertebrates, there seem to be various stages of development. At each stage, new structures emerge, which then have to be further developed. I have a long and complex online document trying to break down the different stages and how they interact. I at one time tried to produce diagrams showing the relationships between the stages and it got very complex, very messy. But I think it just fits in with what you were saying about the complexities of these things and how difficult it is to come up with simple theories to explain what's going on in those processes. I'm also suggesting that we may need some new changes in fundamental physical theories to get full answers to how those processes work. I don't know if that was something you were saying as well, but you may not want to go that far.

[1:02:57] Chris Fields: I think we can very safely say that theories that arise in this domain are not going to be simple and are going to be very highly context dependent. What we can say about what one organism does sometimes, and sometimes doesn't, does not generalize to what other organisms do. Nature has figured out a great many different ways of accomplishing the same sorts of tasks.

[1:03:32] Aaron Sloman: I'm thinking that there's something about fundamental physics that allows information that's very different from Shannon information: information in this old Jane Austen sense, which I believe cannot be given any explicit definition in terms of simpler concepts, but which in some sense has been understood and used for centuries by humans in communicating with one another. I think that there is something there in that notion of information, which is very deep and will challenge fundamental physics if ever physicists start looking closely at the features of that kind of information.

[1:04:23] Chris Fields: We may know the answer to that in 50 years.

[1:04:27] Aaron Sloman: We can come back from another place.

[1:04:31] Chris Fields: Thank you, Aaron. Good to see you. It looks like Mike has had to go too, so perhaps we should.

