Rama Ratnam

Academic interests

As I get older and mature as a biologist, I find myself leaning more and more towards evolutionary biology, particularly the adaptation of biophysical mechanisms under selective pressure. Of these mechanisms, the neural code, its origins, its biophysical substrate, and its subsequent shaping of animal behavior, is a deep problem worthy of investigation. The central question of interest to me is: how good is the representation of the external world in the sensory periphery? Has selective pressure resulted in the best possible representation of the sensory world? If so, how do we show this at both physiological and evolutionary time-scales? I have been working on the neural code at a physiological level for some years, and I hope to pursue the larger question, that of ultimate causation, when I get more time.

In pursuing the coding problem, and other questions, there are some model systems that I go back to over and over again. They are, in some sense, my first love. In them I can see the eternal struggle between change and adaptation. They reveal with great clarity the structure of evolutionary processes.  

a)   The first system is the vocal communication system in anurans (frogs and toads). Male vocal communication behavior and female mate choice are among the most beautiful examples of the links between proximate and ultimate mechanisms in sexual selection. The males have one function, and one alone: to show up every evening during the breeding season, attract females, and donate sperm. Female mate choice exerts demanding selective pressure, and males have responded by shaping the dynamics of calling to increase their chances of attracting females. In 1989, Peter Narins (at UCLA), with Jeff Brush, was the first to suggest that the male calling system in the Puerto Rican coqui frog resembles digital communications, with a single call acting as a discrete "packet". Narins was prescient. More than twenty years later, Doug Jones and I used microphone-array processing to sharpen and extend Narins's findings in the American green treefrog. We showed that these frogs perform a kind of optimized digital communication, using discrete phase hopping to (presumably) avoid acoustic interference while maintaining chorus synchrony. It is remarkable that a system of loosely coupled physiological oscillators has converged on features found in digital communications. There is so much more to explore here.
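To make the idea concrete, here is a toy sketch in Python. It is not our microphone-array analysis, and every number in it (call period, call duration, hop size) is made up: two callers share the same period, and one of them hops its phase in discrete steps whenever its next call would overlap its neighbor's, which is the flavor of behavior I am describing.

    import numpy as np

    PERIOD = 1.0        # call period in seconds (made-up)
    CALL_DUR = 0.25     # call duration in seconds (made-up)
    HOP = 0.1           # one discrete phase hop, as a fraction of the period
    N_CALLS = 20

    rng = np.random.default_rng(0)

    def call_times(phase, n=N_CALLS, jitter=0.01):
        """Call-onset times of a periodic caller with a given phase and small timing jitter."""
        return np.arange(n) * PERIOD + phase * PERIOD + rng.normal(0.0, jitter, n)

    def overlaps(t_a, t_b, dur=CALL_DUR):
        """True if calls starting at t_a and t_b overlap in time."""
        return abs(t_a - t_b) < dur

    a_calls = call_times(phase=0.0)     # caller A keeps a fixed phase
    phase_b = 0.05                      # caller B starts nearly in phase with A

    b_calls = []
    for k in range(N_CALLS):
        t_b = k * PERIOD + phase_b * PERIOD
        # If this call would collide with A's call in the same cycle, hop the
        # phase by one discrete step (modulo one period) and try again.
        while overlaps(a_calls[k], t_b):
            phase_b = (phase_b + HOP) % 1.0
            t_b = k * PERIOD + phase_b * PERIOD
        b_calls.append(t_b)

    print("final phase offset of caller B:", round(phase_b, 2))
    print("remaining overlaps:", sum(overlaps(a, b) for a, b in zip(a_calls, b_calls)))

The point of the caricature is only that a simple local rule, applied in discrete steps, removes call collisions without disturbing the shared period.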

 

b)   The second system is the structure of bird song and how it can change (I am not talking about songbird learning, but rather about sudden or abrupt changes in song syllables, in some direction). The endangered golden-cheeked warbler (Setophaga chrysoparia), found largely on the Edwards Plateau in central Texas, has modified one syllable in its so-called Type B song. The shift is from a buzzy trill to an exquisite downward and upward frequency modulation that introduces a rather sweet lilt to the voice. Why has it done this, particularly when its Type A song is intact? In a field study spanning several years (with my collaborator Wendy Leonard, a Park Ranger with San Antonio Parks and Recreation), we showed that there is a definite modification of a particular song syllable between songs recorded in the early to mid-1990s and those recorded ten years later, and that these modifications occur across a transect of over 150 km. We have, however, only scratched the surface, and there are deep questions here, particularly concerning the malleability of one voice (the Type B song) relative to the other (the Type A song). The frog and warbler bioacoustics projects require me to be out in the field collecting data. Texas is great for that.
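For readers who have never heard the two sounds, here is a Python caricature of what "buzzy trill" versus "down-then-up frequency modulation" means acoustically. The frequencies and durations are invented purely for illustration; they are not taken from our warbler recordings.

    import numpy as np

    FS = 22050                              # sample rate, Hz
    DUR = 0.2                               # syllable duration, s (made-up)
    t = np.linspace(0.0, DUR, int(FS * DUR), endpoint=False)

    # 1) "Buzzy" trill: a 6 kHz carrier chopped by a ~60 Hz square-wave
    #    amplitude modulation, which smears energy into sidebands.
    carrier = np.sin(2 * np.pi * 6000 * t)
    buzz = carrier * 0.5 * (1.0 + np.sign(np.sin(2 * np.pi * 60 * t)))

    # 2) Down-then-up sweep: the instantaneous frequency glides from 7 kHz
    #    down to 5 kHz and back up, giving a single clean frequency track.
    f_inst = 6000 + 1000 * np.cos(2 * np.pi * t / DUR)
    phase = 2 * np.pi * np.cumsum(f_inst) / FS      # integrate frequency to get phase
    sweep = np.sin(phase)

    # Writing these to WAV files (e.g. with scipy.io.wavfile.write) or
    # plotting spectrograms makes the contrast between the two syllable
    # types obvious.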

 

c)   The third system is the active electrosensory system of weakly electric fish. The neurons producing the oscillating electric organ discharge in these fish are among the most precisely timed biological pacemakers. In engineering terms, the electric organ discharge is a high-Q, extremely narrow-band oscillator that provides a private channel for short-range sensing and navigation. I believe (as does my colleague Doug Jones) that this tuned narrow-band signal is the ideal, albeit highly specialized, system in which to investigate optimal neural coding of sensory signals.
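To give a rough sense of what "high-Q, narrow-band" means here, the back-of-the-envelope Python sketch below uses illustrative numbers (not measurements from any particular fish) for the discharge frequency and its cycle-to-cycle variability.

    F0 = 800.0      # nominal EOD frequency, Hz (typical order for a wave-type fish)
    CV = 2e-4       # coefficient of variation of the cycle period (illustrative)

    # Crude estimate: if successive cycle periods scatter by a fraction CV of
    # the mean, the instantaneous frequency scatters by roughly the same
    # fraction, so the effective fractional bandwidth is about CV and
    #     Q  ~  f0 / bandwidth  ~  1 / CV
    bandwidth_hz = CV * F0
    q_factor = F0 / bandwidth_hz

    print(f"approximate bandwidth: {bandwidth_hz:.2f} Hz")
    print(f"approximate Q factor : {q_factor:.0f}")
    # With these placeholder numbers Q ~ 5000; since phase coherence
    # accumulates over many cycles, the true spectral Q of such an
    # oscillator is higher still.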

In neuroethology, and behavioral neuroscience in general, specialized sensory systems are interesting because they reveal biological mechanisms that are not necessarily obvious in less specialized systems (such as passive mammalian hearing). The sound localization circuits of the barn owl (Mark Konishi's work), echolocation in bats, and jamming avoidance in wave-type electric fish (Walter Heiligenberg's work) are all such specialized systems. The electric fish continues to provide wonderful insights into sensory processing, even though the electric sense is irrelevant to mammals (including the most egotistic mammal of all).

Another problem, going back to my early years when I abandoned control theory for neuroscience, is that of a central neural controller that regulates muscle tone to maintain posture and balance, and then actively guides movement when needed. If you consider the number of degrees of freedom available to the major joints (some 20 joints, at a minimum) and their coupling, including the imposition of constraints and contact forces, the resulting Jacobian and Coriolis matrices become massive. The only direct feedback comes through the vestibular system and the proprioceptive receptors found in muscles, tendons, and ligaments, with indirect feedback through the visual and somatosensory systems. How is a controller structure even realized in this high-dimensional, highly nonlinear system? Which states are observable and controllable? Further, how does the controller adapt (or decline) as we grow older or suffer insults and traumas? This is a rich problem at the intersection of neuroscience, integrative physiology, kinematics, forward and inverse dynamics, and control theory. Surely, we can make a go of it, no?
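As a minimal caricature (nowhere near the full multi-joint problem), one can linearize a single-link "ankle strategy" inverted pendulum about upright and run the standard controllability and observability rank tests. The Python sketch below does exactly that, with made-up but plausible parameters; the real question is how anything like this scales to the full nonlinear system.

    import numpy as np

    m, L, g = 70.0, 1.0, 9.81     # body mass (kg), CoM height (m), gravity
    I = m * L**2                  # point-mass moment of inertia about the ankle

    # State x = [theta, theta_dot] (lean angle and angular velocity),
    # input u = ankle torque, measurement y = theta (e.g. from proprioception).
    A = np.array([[0.0,           1.0],
                  [m * g * L / I, 0.0]])
    B = np.array([[0.0],
                  [1.0 / I]])
    C = np.array([[1.0, 0.0]])

    def ctrb(A, B):
        """Controllability matrix [B, AB, A^2 B, ...]."""
        n = A.shape[0]
        return np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

    def obsv(A, C):
        """Observability matrix [C; CA; CA^2; ...]."""
        n = A.shape[0]
        return np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

    print("controllable:", np.linalg.matrix_rank(ctrb(A, B)) == A.shape[0])
    print("observable  :", np.linalg.matrix_rank(obsv(A, C)) == A.shape[0])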

Is it possible to remove the cochlea, replace it with a processor and a micro-electrode array implant that drives the auditory nerve, so that the nervous system can't tell the difference? A sort of bionic ear. Cochlear implants should do this but they do not recreate the same pattern of electrical activity as the intact cochlea. This is a nice plug-and-play problem, but I suspect really hard because of problems with figuring out the correct neural code, getting the anatomy right for surgical implantation, developing bio-compatible materials, etc. Moving towards this kind of neural implant is a worthy goal.
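To make the gap concrete, here is a deliberately crude Python sketch of the kind of sound-to-electrode mapping a conventional implant processor performs: split the sound into a few bands, extract each band's envelope, and let that envelope set the current on one electrode. It is not any manufacturer's algorithm, and the band edges are invented. The distance between this coarse envelope code and the intact cochlea's fine spike-timing structure is exactly the problem I am describing.

    import numpy as np
    from scipy.signal import butter, sosfilt, hilbert

    FS = 16000                                  # sample rate, Hz
    EDGES = [200, 500, 1000, 2000, 4000, 7000]  # band edges, Hz (illustrative)

    def band_envelopes(sound, fs=FS, edges=EDGES):
        """Return one envelope per frequency band (one per 'electrode')."""
        envs = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            band = sosfilt(sos, sound)
            envs.append(np.abs(hilbert(band)))   # envelope via the analytic signal
        return np.array(envs)

    # Example: a 1.5 kHz tone mostly activates the 1000-2000 Hz "electrode".
    t = np.arange(0, 0.1, 1 / FS)
    tone = np.sin(2 * np.pi * 1500 * t)
    print(band_envelopes(tone).mean(axis=1).round(3))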