C.D. Deshmukh Memorial Lecture at the India International Centre, Delhi
Ladies and Gentlemen,
It is a great honour to be invited to deliver this prestigious lecture in the memory of the late C.D. Deshmukh, especially at the India International Centre, which was C.D.'s creation and one of his most beloved projects. I feel somewhat inhibited in fulfilling my assignment today, since from my childhood I had looked upon C.D. Deshmukh as a towering personality. I have memories of his visit to our house back in the early nineteen fifties, when I was in secondary school. My parents asked me on that occasion to recite some Sanskrit shlokas. I did so with some trepidation, since I had heard what a great scholar of Sanskrit C.D. was. I also recall feeling very happy when the 'Guest of Honour' complimented me on my Sanskrit pronunciation.
On that occasion he had come from a function at the Women's College of the Banaras Hindu University. The university is often referred to as B.H.U., and C.D. used this fact in one of his typically witty remarks. He told the women students that, as students of the B.H.U., they were 'Bhu-kanyas', and as such they should emulate the ideal Bhu-kanya - Sita.
I mention this incident as an example of the multifaceted personality of C.D. Deshmukh. He trained as an I.C.S. officer, then became Governor of the Reserve Bank of India, was Finance Minister in Nehru's government, then Chairman of the University Grants Commission and later Vice Chancellor of Delhi University. He translated the Meghaduta of Kalidasa into Marathi verse, besides being the author of several other books and tracts. In all these avatars he was always conscious of quality and integrity. His principled stand on the Samyukta Maharashtra issue will always be remembered with great appreciation of his courage born of conviction. His creative achievements were many, this centre being one example of excellence emerging from his clear vision and meticulous planning.
It is a pleasure for me to dedicate this lecture on culture of science to C.D. Deshmukh who personified culture in all its aspects. I have a feeling that a person of his independent thinking would appreciate my concluding remarks.
It has become a cliché to say that we live in the 'Age of Science'. The ambience of science and technology is everywhere in our life. Unlike the case, say, three to four centuries ago, when science was limited to a narrow band of practitioners and its impact confined to the walls of the laboratory, today society cannot afford to ignore its existence. In this context I will try to present what I understand to be the culture of science and how it has evolved to its present state.
Some explanation is needed right at the beginning as to how science, which is considered neutral and objective, may have a culture of its own. What I mean here is the impact and practice of science, rather than science itself. How do the practitioners of science at any given epoch view their subject, how does society itself perceive the impact of science, and finally, what bearing does all of it have on the ethical values of the time? That such issues carry relevance to society at large, and to intellectuals in particular even if they don't belong to the field of science, does not need to be justified.
Indeed, I may begin with a short extract from the writings of one such distinguished scholar, D.P. Chattopadhyaya. Distinguishing between civilisation and culture, Chattopadhyaya has said "In a philosophical vein it has been said that civilisation is what we do possess or have, and culture is what we are."1
One may adapt this statement to the present theme by saying that technological achievement of a society is part of its civilisation while the spirit that drives its science is part of its culture.
As an example, one may point to issues currently debated hotly, e.g., should science be harnessed towards deadly weapons that will effectively eliminate the human race from the face of this planet? Is it ethical to tinker with nature's reproductive processes through cloning? To what extent is there an intellectual property right on sponsored research if it is deemed essential for human welfare? I believe that such issues pertain to the scientific culture of the society.
Notice that the questions I just raised are the outcome of the present times. They were not, and would not have been, asked a century ago. Nevertheless, the present cannot be appreciated without reference to the past, nor can it be relevant without reference to the future. Rather than confine the discussion to the present times, therefore, it is worth viewing it in a historical context first and then trying to use it as a crystal ball for the future. So far as history is concerned, I will not follow any chronological sequence, nor do I lay any claim to exhaustiveness. I begin with Europe of pre-Newtonian times, dating back to the eleventh century.
Let us use the time machine of H.G. Wells to travel to eleventh-century China. The then Emperor of China had a firm belief that he would receive omens from the heavens to guide his rule. Were he to depart from the path of justice and integrity, he would be warned by some untoward event in the sky. So he employed astronomers to watch the sky and maintain records of whatever they saw there. That is how the Chinese astronomers came to record an unusual event. From their meticulous notes one may fix the date as July 4, 1054. On that day these observers saw what they chronicled as the arrival of a 'Guest Star'. This star suddenly appeared in the sky, was bright enough to be visible even during daytime, and gradually faded away. The Chinese had a custom of designating a transient object in the sky, like a comet or a meteor, as a 'guest'. Similar records are found in Japanese annals also.
In actuality, with the hindsight of nine centuries, one can say today that what those sky-watchers saw on July 4, 1054 was the explosion of a star, the so-called supernova explosion, in which the entire star is disrupted, ejecting its envelope and retaining a small compact core which is shot off in the opposite direction like the recoil of a gun. Although the star did exist in the sky before the explosion, it was not bright enough to be seen by the naked eye. After the explosion, however, a supernova becomes extremely bright for a short while, so much so that it may outshine an entire galaxy of a hundred billion stars. No wonder, during the early days, the Chinese saw the star even during the daytime.
The debris of the explosion may, however, be seen even today, more than nine and a half centuries after the event. It is no longer visible to the naked eye, but it can be seen in spectacular astronomical photographs as a diffuse but highly energetic cloud called the Crab Nebula. The name derives from the luminous filaments seen in the gas cloud, resembling a crab's legs. The cloud represents the debris left behind by the explosion seen by the Chinese back in 1054 AD. In the late 1960s, astronomers also discovered the core left behind by the explosion. It is a pulsar, that is, a highly compact star emitting pulses of radiation in an extremely regular manner.
It is interesting to see why these records were kept. As I mentioned earlier, the Chinese held the belief that their Emperor received signs from the sky as hints, advice or portents relating to his performance on the Earth. The Crab Supernova was duly noted as part of this belief. [History is silent on what the Emperor made of it all!] In the case of Ibn Batuta, the heavenly signs were believed to be related to epidemics on the Earth, and as a medical doctor he was interested in an epidemic then raging in the Nile Valley. So far as the Native Americans were concerned, the sight must have been unprecedented and unusual enough for them to feel the urge to record the event in rock. While the lunar eclipse, the solar eclipse and comets constitute unusual sights, they are far more frequent than a supernova explosion. Only three supernovae have been sighted within our Galaxy in the last nine centuries, although more distant ones in faraway galaxies are seen regularly, but only with sensitive telescopes.
How did Europe react to this event? Well, the important fact is that searches in European manuscripts of those times do not show any records of the event. Like the episode of the dog in the Sherlock Holmes story, the dog that did not bark at night, this non-event is significant. One may ask: Why are there no records?
Historian of science George Sarton and astrophysicist Fred Hoyle have independently argued that the event was not recorded because it was inconsistent with the then accepted paradigm about the origin of the universe. The Biblical belief that God created the universe in seven days was rigidly adhered to in those times. Under the circumstances, how do you interpret a new star suddenly appearing in the sky? Rather than get into any controversy or face reprimands from their superiors for sacrilegious observations, the monks who kept all the records may have simply chosen to ignore the event! Lest it appear preposterous that an existing belief should dictate what you should observe and record, and what you should not even mention, I will later describe situations in modern times which remind us how strong the influence of a set paradigm can be.
The Crab example illustrates how science was perceived in medieval Europe: it related to the study and observation of Nature, but strictly within the religious framework. That is why, five to six centuries later, Copernicus and Galileo encountered tremendous hostility in their times when they sought to cast doubt on the geocentric theory sanctioned by religion.
Take the case of Galileo's telescope. Although tiny compared to its modern versions, this pioneering instrument led to such important findings as sunspots, lunar craters and Jupiter's four largest satellites. Yet the intelligentsia in Galileo's time did not take kindly to the telescope, for its findings questioned their deeply ingrained beliefs. If God created the universe in total perfection, how come the Moon has pockmarks on her beautiful face and the shining Sun has dark spots? And if the whole universe is supposed to revolve around the Earth, how come Jupiter has four moons going around it? No wonder the telescope was dubbed an instrument of magic and witchcraft, not to be trusted for observations.
The magnum opus of Copernicus was published when he was on his deathbed; he died shortly after receiving the published copy. It is now widely believed that the preface at the beginning of the book is not what Copernicus himself would have written. It seeks to present the new heliocentric theory, which was the lifetime's work of the author, very mildly as an alternative hypothesis rather than as a fact: as if the writer of the preface was anticipating violent criticism of the main text of the book. It is very likely that the editor or the publisher substituted this preface in place of a possibly more outspoken one by the author himself.
Besides being a staunch supporter of the heliocentric theory, Galileo had openly attacked the Aristotelian philosophy of nature. His book, a dialogue between the two world systems, heaped ridicule on the Aristotelian ideas of motion, force, etc. While debates were not uncommon in those times, the normal practice was to confine them to philosophical arguments rather than practical demonstrations. Galileo, perhaps the first scholar to rely on experiments to confirm a theory, demonstrated through cleverly conceived experiments how several of the long-held beliefs were untenable. This was what a good experimental physicist is expected to do.
Sensing a threat to its doctrines, the Roman Church summoned Galileo before the Inquisition. Although he finally publicly recanted and accepted the views of the Church, it is said that privately he continued to believe in the heliocentric theory. In modern times, in the second half of the twentieth century, Pope John Paul II set up a study group to go over the historical papers relating to Galileo's persecution, with the result that the Church, some 350 years later, publicly acknowledged that Galileo had been right and had been treated unfairly.
Reviewing the episode, Olaf Pedersen has commented that Galileo himself did not have a convincing argument as to why he thought that the Earth moves and the Sun is at rest. His best argument (which later turned out to be wrong!) was that the splashing of the seas during tides must be because they are on a moving Earth. The real reason for tides lies in the Moon's influence on the Earth via the law of gravitation, which had not been discovered in Galileo's time.
To summarise, the period 1050-1650 AD illustrates how science in Europe was perceived as part of the tenets of the Church. It began to have an existence independent of the Church only with the advent of the Newtonian laws of motion, gravitation, optics, etc.
The seven centuries starting from around 450 AD may be considered the golden period of Indian astronomy, spanning as it did the times from Aryabhata to Bhaskara II. Aryabhata, in his book Aryabhateeya, makes a statement that essentially means that the Earth spins against a background of distant stars:
pashyatyachalam vilomagam yadvat
(Aryabhateeya, Chapter 4, Verse 9)
This statement has been commented upon by subsequent astronomers of the above period [see, for example, the article by Bina Chatterjee, 'Aryabhata's Theory of Rotation of Earth', Indian Journal of History of Science, vol. 9, pp. 51-54 (1974), describing a few responses]. In none of the cases did any astronomer support this statement. Rather, the attempt was to reinterpret what Aryabhata said or to 'wish it away' as something of an embarrassment.
Unlike the cases of Copernicus and Galileo, biographical details on Aryabhata are sadly lacking. How was his statement received in contemporary circles? Was there any religious dogma in India that required the geocentric theory to be right? So far as I know, the prevailing religious or philosophical ideas of the day had no such specific bias. It is possible that the geocentric theory came from Greece and had gained a certain level of respectability. But this does not explain the cool reception accorded to Aryabhata. He was born in Kusumapura, near Patna in the present state of Bihar. Why did he go to Gujarat and then to Kerala, as some versions of his life story have it? Was that because of local ridicule for making a statement that did not agree with the wisdom of the day? Alas, we do not know. We do know from the book itself that Aryabhata was born in 476 AD and that the book was written in 499 AD.
The difficulty of answering these questions is typical of the kind that historians of science in India face: viz., lack of reliable documentary material. In a project supported by the Indian National Science Academy, I had looked for any possible record of the sighting of the Crab Supernova, which I mentioned earlier. Assuming that there was no religious dogma against such a sighting, as there was in the case of the European scholars, and knowing that the event occurred during a phase when Indian astronomy was flourishing, one should have come across records of its sighting. Despite extensive searches of old manuscripts and books of the period (or later ones), my colleagues and I failed to find any statement corroborating the event. Did people in India really keep any records of its sighting? That they must have seen the event is certain, for the star would have been visible from the subcontinent, and, monsoon or not, it was visible for a long enough period to be noticed by a large number of human communities.
The main reason advanced for the scarcity of any written material is that our tradition did not encourage writing and recording; the transfer of knowledge was largely through oral means. In addition, there is the difficulty of settling the reliability of any written material: was it written at the time of the original manuscript, or was it inserted later? There are also instances where a later manuscript simply reproduces what was written in another manuscript several centuries earlier. Thus even if one found a reference to an event, one cannot be sure whether it appears there for the first time or has been lifted from an earlier manuscript.
In this context, I may narrate how I was fooled on one occasion. This happened when my attention was drawn to an extract in the ancient book called Shukraneeti, spelling out in detail the welfare measures that an employer should provide for the employees and their families. These included earned leave, retirement benefits like pension or provident fund, an employee's widow's pension, etc., with figures not too different from what we have today. Here was an example, I thought, of our ancients thinking of these social security measures well before the westerners, for the Shukraneeti is estimated to have been written during the Gupta period, about the 4th century AD. However, when I wrote about it in a newspaper article, I received a letter from a scholar giving well-corroborated evidence that the portion concerned was inserted by someone in the employment of the East India Company, and those welfare measures were none other than those offered by the Company to its employees!
Compared to China, Europe and the Middle East, our historical records of scientific work done in this part of the world are therefore scanty, notwithstanding the works of the astronomers and mathematicians of the above golden period. This probably tells us something about the status of these subjects in our social perception. Did science as a means of understanding nature not enjoy a privileged status? Was recording data and interpreting their underlying pattern not part of scholastic studies? I may be open to correction, but I do sense an apathy towards science as natural philosophy in our early traditions. This may well be the reason why science did not flourish in India in the next millennium, until it was essentially imposed, in its by then well developed European form, by our colonial rulers. One may also raise here the frequently asked question as to why our rajas and maharajas, who lavishly patronised the arts and humanities, ignored science and technology altogether. Their European counterparts did appreciate and sponsor science as part of culture and knowledge, and this was one of the main reasons why science took off so well in the renaissance and post-renaissance period in Europe.
Hermann Bondi has commented thus on the success of a scientific theory: it is 100% if it is tested and accepted, 50% if it is violently resisted and rejected, and 0% if it is ignored. Much though one may condemn the treatment meted out to Galileo, he at least achieved a 50% success on Bondi's scale, whereas Aryabhata scored 0%. For unlike Galileo, who stirred up a hornet's nest with his ideas and experiments, Aryabhata does not seem to have created any impact, barring a few slight ripples amongst a select few astronomers like Brahmagupta.
With the success of the Newtonian framework in Europe, science broke away from the shackles of religion, and a materialist and empirical approach took its place. The culture of science changed as the subject advanced in a dramatic fashion during the 19th and 20th centuries and led to the industrial revolution. Indeed, through its strong connections to high technology, defence and the consumer world, as well as to big budgets for frontier research, the culture of science today has been radically transformed.
Take the case of Isaac Newton, who could afford to wait for two decades to satisfy himself fully of the viability of the law of gravitation before publishing it. He thought of the law during his anni mirabiles of 1665-66, when he had moved from Cambridge to his native village of Woolsthorpe to avoid the plague epidemic. He wished, however, to satisfy himself on one theoretical point and one observational point. The former was to prove his conjecture that a spherical body attracts as if all its mass were concentrated at its centre. The latter was to know the details of the Moon's motion round the Earth. Only when he was fully satisfied on these points did he take up Edmond Halley's suggestion that all his work be published in the form of a book. Thus his great work, the Principia, was written in three parts during 1686-87.
Today's scientists are guided by different priorities dictated by the competition for grant money and peer recognition and must rush to the press even before they have completed their lab work. Or, going to the other extreme, they may be forced to keep it suppressed for reasons marked ‘classified’ or under patent protection. Both are symptoms of external pressure on how science is naturally expected to evolve.
The relaxed style of science continued from Newton's time almost up to the Second World War. The famous path breaking - and nucleus breaking - findings of Rutherford did not require expensive machinery. Skilled and imaginative workmanship and a good lab-cum-workshop were adequate to provide all the equipment needed by scientists. The pattern changed after the world war and organised science with bigger and bigger budgets and a highly competitive spirit became the order of the day.
I will not dwell at length on the current issues relating to organised science, weapons research, cloning and genetic engineering and the question of intellectual property rights versus free exchange of information between scientists. I shall only stress the fact that these issues are inevitable and have to be faced whether we like it or not. Because of the rapid growth of science and its technological applications, their impact on our daily life has multiplied enormously since Newton's time. The issues I just mentioned are symptomatic of this development. It is not fair to blame scientists only for their behaviour today; one has to see the entire social context in which they work. If the culture of science today is different from Newton's time it is because science today is developing in a social environment very different from that in the seventeenth century.
Nevertheless, I will end by highlighting a rather disturbing feature of modern scientific culture, one that threatens the freedom of quest that has guided the progress of science so far. Even in the relatively calm waters of a pure science like astronomy one sees its effect.
Big budgets have brought about a change in the culture of pure research today. The scenario is as follows. Most frontier research costs a lot of money. The original apparatus used by Rutherford to split the nucleus cost no more than a hundred pounds; today's high-energy particle accelerators cost several billion dollars. In place of Galileo's one-inch telescope, made by him in his own workshop, today's space telescope is again a billion-dollar, multi-institutional and multi-agency project. Any scientific experiment using such an expensive facility necessarily costs big money too. To secure funds for this research one must approach a funding agency...usually the government of the country. The funding agency allocates funds on the recommendations of peer review committees. These committees, guided no doubt by the need to allocate scarce funds in the 'best' possible way, look for impeccable credentials of the proposer and credibility of the proposal. As a result of this scrutiny, only the 'safe', that is, 'no-risk', proposals are selected. Safety is of course decided by the criterion that what is to be looked for should be consistent with what we already know. The review committees discourage explorations of wild, untested ideas for fear of getting mired in cranky concepts.
While all this seems common sense, try to apply it to Galileo and Copernicus. If you were a member of a peer review committee set up by the Church of Rome and followed today's criteria, you would have no option but to reject their proposals! That we have turned full circle from those days is illustrated by the episode of the astronomer Halton C. Arp. I shall conclude with a brief account of that episode.
Arp is an experienced observational astronomer who trained under Edwin Hubble, the astronomer credited with the discovery of the expansion of the universe. Hubble's law forms the foundation of modern cosmology. Arp himself has many discoveries to his credit, and his atlas of peculiar galaxies is a standard source-book for modern observers. Yet, since the 1970s, Arp has been finding several cases of galaxies and quasars which do not fit into the Hubble expansion picture. These are known as anomalous observations. To understand their significance, let us look at Hubble's law.
Hubble's law states that the speed of recession of a galaxy, that is, the speed at which it appears to move away from us, increases in proportion to its distance. Thus if galaxy A is twice as far away from us as galaxy B, then A moves away from us at twice the speed of B. Likewise, if two galaxies are near neighbours, their speeds of recession should be nearly equal.
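Stated compactly, in the standard textbook notation (which is not used in the lecture itself), the proportionality reads:

```latex
% Hubble's law: recession velocity is proportional to distance
v = H_0 \, d
% For two galaxies A and B with d_A = 2 d_B:
v_A = H_0 (2 d_B) = 2 v_B
```

Here $v$ is the speed of recession of a galaxy, $d$ its distance from us, and $H_0$ the constant of proportionality known as the Hubble constant; doubling the distance doubles the recession speed, which is exactly the comparison between galaxies A and B described above.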
A typical anomalous observation may show two objects, say galaxies lying very close to each other on the sky, with apparent velocities of recession very different from each other. That is, these galaxies are going away from us with significantly different velocities, even though they appear to be near neighbours. This result is in clear violation of Hubble's law.
The establishment reaction to this type of evidence ranges from disbelief and anger to plain apathy...for to face it squarely would involve questioning or seriously modifying the existing paradigm in cosmology. After Arp's early discoveries of such anomalous cases, he was told that these were perhaps freak cases and might not be indicative of a real discrepancy. However, with time and more observations, Arp could produce more and more such examples. So a stage came, in the early 1980s, when the leading observatories closed their doors to Arp, on the grounds that what he discovered from his observations did not make sense, because it did not fit into the existing paradigm. Today, the list of anomalous cases has grown and diversified into several different types, all inconsistent with Hubble's law. There is a handful of workers who have been steadily discovering such examples.
When accounts of such cases are submitted to research journals, the journals do not take kindly to publishing them. Scientists refuse to debate the issue in international conferences: they would sooner forget it altogether. (Recall the example of no records being kept of the Crab Supernova by the medieval monks because the event did not fit into their paradigm.) The message, therefore, is that in order to succeed in science you must learn to conform. No wonder young research workers hesitate to enter this contentious field.
The history of science has shown that anomalies are often significant, as they are nature's way of disclosing a new secret. It behoves the true scientific spirit of exploration to go deeper into such cases. Either they may turn out to be spurious, in which case the existing paradigm is further strengthened, or they may reveal a hitherto unknown fact. However, by ignoring anomalous cases or positively discouraging their investigation, we may be closing the door on something new. In such an environment, how can new ideas prosper in science? This is the serious crisis faced by the current culture of science.
How the situation has changed over the past few decades is illustrated by the following anecdote. In an interview at the Tata Institute of Fundamental Research, the Nobel Laureate Subrahmanyan Chandrasekhar recalled that when the funding for the five-metre telescope to be built on Palomar Mountain was approved in the nineteen thirties, Edwin Hubble and the Cambridge theoretician Arthur Stanley Eddington gave a press conference. When asked, "What do you expect to find with this new telescope?", their reply was, "If we knew the answer, there would be no reason for building it." This open-minded approach towards the unknown may be contrasted with any proposal for a new telescope in modern times. There the proposers have to spell out in advance, and in great detail, what they expect to find with the new instrument. Evidently the proposers have a theory in mind for which they seek further support from the instrument. Any finding even remotely contrary to that theory would naturally be discouraged.
Thus today the culture of science stands at a crossroads. The example in astronomy is not an isolated one; each branch of science will have some skeleton in its cupboard. Alfred Wegener, the proposer in 1915 of the idea of continental drift, was ridiculed throughout his life. Later, in the 1950s, the idea became acceptable, and it is considered well established today. The concept of panspermia was ridiculed by biologists on the grounds that micro-organisms travelling in interstellar space would not survive the hostile extreme conditions. Today the idea is on its way back, as bacteria are showing evidence of survival under similar conditions reproduced on Earth.
Of course there are a large number of cranky ideas floating around, claiming to be superior to the works of Newton, Einstein, Darwin, etc. A superficial examination of these reveals their hollowness. What I am referring to are serious scientific works like those I have just mentioned: works that are ignored just because they do not fit into the popular paradigm.
The normal scientific method dictates that anomalous observational evidence or alternative theoretical models should be examined thoroughly before being rejected or accepted, just as it recommends a fair trial for any alternative to the standard hypothesis. The current sociology does not permit this. Rather, it prompts the adherents of the standard paradigm to oppose any alternative that is seen as a threat to its survival. I have coined the name 'scientific fundamentalism' for such a closed attitude. Will the culture of science emerge one day out of its influence?
1. On the Nature of Interconnection between Science, Technology, Philosophy and Culture, Occasional Papers, Project of History of Indian Science, Philosophy and Culture, 1991
Professor J. V. Narlikar is Founder Director, Inter-University Centre for Astronomy and Astrophysics, Pune 411007