Here are some links to password-protected pages that don’t appear on the main blog roll. The password is shared with people who need access to them.
An autobiographical piece about being born with a cataract, seeing double my whole life, and why I became a neuroscientist. It’s the prologue to a book I am writing titled ‘An Emergent I’ (currently paused due to other projects).
‘Of the essentials of preserving life, nourishing the breath has no peer. When the breath is exhausted, the body dies, when the people are downtrodden, the nation collapses’
Zen Master Hakuin Ekaku, Letter to a sick monk, 18th century Japan
‘…To explicate the uses of the Brain, seems as difficult a task as to paint the Soul, of which it is commonly said, That it understands all things but itself;’
Thomas Willis, Preface to Cerebri Anatome, 1664
Prologue: A Semi-Monocular I
‘You never really understand a person until you consider things from their point of view… until you climb into [their] skin and walk around in it.’
Harper Lee, writing as Atticus Finch, in To Kill A Mockingbird
A Forbidden Experience
This is a book about the self. And since eyes are said to be the window to the soul, they make a good place for us to start. Eyes have also been a topic close to me since my birth. To explain this, I want to begin by telling you something about myself, about how I see the world, and thus something about the brain that wrote this book.
I was born blind in one eye. Because of this, there is an experience that I have never had. Not only have I never perceived it, I cannot even imagine it. It is an experience that you, and almost everybody you have ever met, take entirely for granted. Yet it is also profoundly beautiful, and this missing beauty in my own life captivates me. I know it is beautiful from the accounts of the few who lacked this experience and then gained it later in life.
This experience is the ability to see depth through binocular vision, a faculty called stereopsis, and it is a central part of how the normally sighted see and interact with the world. We commonly refer to it as three-dimensional vision.
Stereopsis comes from our forward-facing eyes, and therefore only some animals, usually predators, have it. It is what allows you to sense how far away an object is: how distant a balloon hangs in the air, or how far to reach for a glass.
How do we sense depth? There are multiple ways. Firstly, there are cues for depth that come from individual eyes: things that are closer to you move more when you move your head, and obviously they look larger than when they are far away. These are monocular cues, and we can see their power in the depth apparent in flat, two-dimensional works of art. Leonardo da Vinci made a detailed study in his notebooks of how such monocular cues can be used to create depth in art. But despite their usefulness, these cues carry far less information than normal binocular vision. This other, binocular kind is the one I lack, the one that is so elusively, beautifully different.
Binocular vision cannot come from either eye alone. That is to say, it only comes from the combined use of two eyes as one. It is therefore in part a creation of the mind.
The additional information used to create a sense of depth is the slight difference in viewing position between the two eyes, which leads to two slightly different images being formed and sent to the brain. To see this, try holding an object close to you, then close either eye in turn, and see how the image changes. How much it changes depends upon how far away the object is: an apple held close to you looks far more different from eye to eye than an apple in the distance. The brain uses this disparity to estimate distance and so create binocular depth.
Without this estimation of depth, the world has a flatness that is obviously hard for me to describe. My brain has only ever had relatively minimal depth cues as it uses only one eye, and so my worldview is correspondingly flat.
Interestingly, once stereopsis has been acquired, even closing one eye does not make it fully go away. Normal brains use their past binocular experience to ‘fill in’ and create an enriched sense of depth even when one eye is shut, a fact confirmed by studies showing that binocularists are better at judging depth with one eye than monocularists are. Even if you closed one eye, you would not see the same flatness that I see. Susan Barry wrote a beautiful memoir of gaining normal stereopsis later in life, an experience so unusual that Oliver Sacks profiled her in the New Yorker. She describes herself as ‘overwhelmed’, the experience as ‘remarkable’ and ‘dramatic’. It is a ‘distinct, subjective sensation’, quite unlike anything else. She writes of how even films on flat screens gained depth for her once she acquired stereopsis, and of how she felt beautifully immersed in a deep, enveloping sense of worldly depth that she had never experienced before. A particularly haunting part for me was reading of her joy at being surrounded by and immersed in a scene of falling snow in winter, being herself part of this deepened world for the first time. She had been freed from a life of living through a flat television screen, as I must live.
In her book, Barry quotes Frederick Brock (1899-1972), a pioneer in visual therapy who wrote ‘…Before stereopsis is actually experienced by the patient, there is nothing one can do or say which will adequately explain to him the actual sensation experienced.’ Like a colour that has never been seen before, it is beyond the imagination of monocularists like myself to conjure these sensations. Despite having ‘normal’ brains, our imagination is sharply constrained by our lack of experience. Just as no explanation of sound could ever satisfy the deaf, no explanation of depth could satisfy me. It is the invisible thing lying forever in front of me.
Nor is this imperception merely a sensory deficiency. This lack of depth pervades action too. Living in this flat land leads to a certain clumsiness, and to a lifetime of odd habits built up to compensate. At a dinner with a friend in 2018, I reached quickly for a wine glass and badly misjudged its distance. My hand crashed into the newly filled glass, splashing red wine all over his white shirt. My explanation, involving monocularity, stereoblindness and the resulting poor motor control, did not resolve the subsequent awkwardness.
I was aware of this deficiency from a young age, and I channeled the curiosity it bred into a tendency to think about thinking, in order to try to understand myself. This in turn led to questions about the brain.
The birth of an I, and a blooming, buzzing confusion
Let us start with a ‘simple’ question: how do we begin to see? When thought begins in life is perhaps not a meaningful question. To me at least, it seems most likely that thought appears gradually in the womb: we are not desktop computers with simple ‘on’ switches. What is clear is that the opening of the eyes for the first time marks a dramatic change. Babies do not hide this fact upon their emergence, which is rarely a quiet process. What is not so immediately clear from watching them is the marked challenge that seeing and doing present to the newborn, though the flailing of newborn limbs hints at it.
We imagine babies looking back at us clearly for the first time, but the reality is almost certainly quite different. Upon being born, we are met instead by what William James called ‘the blooming, buzzing confusion’. As we shall see, neuroscience has proven James correct. In the ‘great bath of birth’ a blaze of ever-changing lights storms into the fertile young mind. Everything is unexplained; the mind restlessly clutches for anything predictable, familiar. All sorts of regularities are identified and saved by the brain at this time, a process that we shall track later. Edges, shapes, textures, movements, all form a rising crescendo of complexity that is caught by the brain. From this apparent chaos we eventually learn to recognize faces, appreciate beauty and read poetry. Here the brain is forming the foundation upon which more detailed understanding will later be laid, a layer of perceptual lessons upon earlier perceptual lessons. In studying the brain, we are in large part studying the machinery for making sense of this apparent chaos into which the newborn is so suddenly immersed. Brains thus exist to learn useful things about the world.
We will discuss at length the challenges of something so apparently effortless to us as seeing. For now, I want to elaborate on seeing with two eyes.
We do not come into the world able to take the images from our two eyes and make one single, hopefully unified worldview. But, at least for you, it happened. There is one self. A single, internal model of the world built from two eyes, two ears, taste, smell, touch, and even sensors that tell you where your joints are. From the cacophony of newborn experience comes our mental universe. This is not a passive process either: every experience comes to hint at, suggest, or even force action, which in turn changes sensation, and so the cycle continues. From all of this comes a whole, single world.
Well, at least for you. For me, things are more complex. When I, my self, developed in the womb, a small layer of connective tissue remained in the right eye, blocking the light from entering and reaching the retina. I was thus half-blind.
Being half-blind did come with an advantage: for the first five weeks of my life, my baby self had no binocular problem, no need to combine two images; I looked out upon the world with a single eye. All of those worldly confusions of vision were dealt with by one eye’s input to the brain. Things were relatively simple, and I, whatever this I was at that early age, could focus on other priorities in setting up my inner world.
Then, however, everything visual in my life changed. This aberrant connective tissue was removed at a very young age – five weeks – for reasons that we shall see. The second eye, lacking a lens due to this surgery, suddenly came into play, and the confusion blossomed into what I imagine was mental anarchy. A second image began flooding my brain – much blurrier, weaker, unable to focus at all as it lacked a lens – and this image needed to be dealt with.
What happens next in babies such as I was? The ideal outcome, of course, is to have the two eyes see as one, i.e., to be normal. Here, though, there are problems. It is much easier for the brain to simply continue using one eye, especially when that eye has much clearer vision than the other. If left untreated, exactly this happens: the brain blocks the image from the second eye, and the person uses one eye only. This is a common cause of a squint. The brain never learns to use the weaker eye, and so that eye lives on as a kind of vestigial sensory artifact, an eye without a brain. The eye gets forgotten, if it was ever truly known.
To prevent this abandonment of the weaker eye, children like me go through a rather time-intensive and laborious process of ‘patching’ or, more technically, monocular occlusion therapy. In this process, a patch is placed over the ‘good eye’, as my mother used to call it, forcing the child to use the ‘bad eye’. The patch is often worn for the majority of each day, and I made frequent visits to Great Ormond Street Hospital in London, where they tracked my progress. In childhood photographs, I can be seen wearing this patch, peering out at the world through the blurry, weaker eye. I wore it during almost all of my time at primary, or elementary, school. My mother went through parental hell trying to coax her child’s brain into seeing. I repeatedly tore off the cumbersome, sticky object, and had to be bribed with sweets to keep it on. It became a bargaining chip in many a negotiation. I yearned for the hour when I could rip the patch off and see a crystalline world again. Every day, for the first ten years of my life, I went through this. It did not endear me to the school bullies: pirate jokes abounded.
With my mother during monocular occlusion therapy. I am being forced to use my right eye to ‘teach’ my brain how to use it.
The purpose of this treatment is to make the brain learn to listen to the input from the ‘bad’ eye, and it can be effective. For many who go through it, the outcome is relatively normal binocular vision. Some people’s brains still reject the ‘bad’ eye entirely, remaining monocular once the patch comes off. Regardless, the outcome is almost always one worldview, whether from just one eye or from successfully using the two eyes together. A ‘single’, obvious I that sees and acts, a centre point of the self.
For me though, something rather different happened, a compromise between these two extremes. I ended up seeing everything twice.
A doubled life: ‘imperfect’ partners
The task of fusing the two eyes’ images was never accomplished by my brain. Doubtless my baby brain tried to make sense of the images together, but gave up the struggle. Instead, I have suffered from double vision all of my life. As I write, I see two pages, two sets of words, blending into each other. When I look at the sky, I see two suns. I see two moons, two of a lover’s face, two horizons, two of everything.
What is it then, to see twice? Which eye represents the experience of the self? Which image is reality? In truth, one image is more like a ghost, hovering over or under the reality. This ghost is almost always the bad eye: very blurred, legally blind, a little like looking through very frosted glass. I have been unable to wear contact lenses since an eye operation at age 25, but even when I could, I preferred not to. The blurriness helps me to keep those two images separate: one grounded, one floating. This is less confusing.
The relationship of the two images to one another is of particular interest. When I was a child, I almost always presented a ‘lazy eye’: my right eye would stubbornly refuse to look straight, to align with the left. As I entered my teens, this increasingly bothered me, because of how it looked to others.
I began to think long and hard about how to fix this problem of having an obviously lazy eye. I consulted doctors: there is a commonly used surgery that straightens the eyes by subtly tightening or loosening their muscles. However, it only works if the offset between the two eyes is constant. Mine was not. My right eye would drift in different directions, and so there was no fixed correction to make. The doctors also said I was too old to hope that my brain would learn to fuse the two dueling images from the two eyes (more on that duel later). The development of vision happens very young, they said, in a ‘critical period’ when the brain is especially malleable. Because of this, I would have to put up with being cross-eyed forever.
But I was stubborn. I persisted in trying to solve this problem without medical help. What was clear to me was that the fix, if it could be found, would lie not in my eyes but in my brain. What I realized then was that the double vision was actually a secret blessing: it meant that, if I concentrated, I could tell whether or not my eyes were looking at the same point. This is harder than it sounds, since I lacked the mental routines to make the check effortlessly, but through extensive trial and error I slowly learnt to estimate how far apart my eyes were pointing. I learnt to tell whether the eyes were drifting, and thus whether I looked cross-eyed.
I then had to learn to move each eye independently to bring them back into alignment. This does not come naturally to anyone, not even someone with a visual brain as strange as my own. I began searching for the right mental lever, the right thought, to allow me to control the eyes separately. Over time, I gradually found the way to do it. I explored all kinds of abstract mental places before I found the correct one, the one that attached to the ‘move right eye’ or the ‘move left eye’ commands. Over months, I gradually taught myself how to correct each of the relative displacements between the two eyes, including testing myself in front of mirrors. I used the errors, the difference between the expected and the actual outcomes, to slowly calibrate my brain’s ability to keep my eyes straight. As a teenager, I did consciously what you have done effortlessly since early childhood. The images never fused, but they were relatively aligned. Most of the time, unless I am tired or distracted, my eyes are aligned.
I had created in my mind a habit based upon the close coupling of action and sensation. Each little drift in the two images was met by the necessary corrective action. I had not learnt to use my eyes as one, but I had created an illusion of having done so. Even now, several times a minute I notice a drift of the images that is big enough to need a conscious correction, but it is doable.
Recompense for monocularity
I wrote earlier that I ‘suffer’ from double vision, but this is only partly true. It is true that it is most irritating trying to read with this constant, shifting pair of images. It is also difficult to become immersed in something while maintaining the constant corrective habit of keeping the eyes aligned. Following slides in a lecture is particularly difficult. I find it very hard to look at objects in detail, to fix my gaze upon an object for any length of time. The constant habit of correcting the drift is immensely distracting, like trying to think whilst balancing on one leg.
In essence, I lack what the yoga masters call drishti, an essential part of yoga involving a steady, calm, yet concentrated gaze, which in turn settles the body and breath. I am in a constant juggling, shifting state of attention, unable to minutely examine precise detail. It makes eye contact especially odd, and I habitually avoid this discomfort.
But seeing the world differently in this way also has its advantages. I’ll begin with the beauty of the images I see. Because my ‘bad eye’ has no natural lens, the image it sees is heavily blurred. This means that points of light are seen as blazes of illumination, like car lights through a rainy window. That eye’s world resembles a living impressionist painting, with blurs of colours and poorly resolved shifts of shapes. I sometimes sit and simply look out of the right, weaker eye, seeing the world in this unfocussed way. It gives an odd reminder of how distorted normal perception is.
But the real beauty comes when using both eyes together, as best I can. Through this, viewing the world becomes a kind of internal poetry, an impressionist painting dancing above the crystal, lens-given ‘reality’ of the good eye. A dancer’s precise pirouettes are illuminated in a bright, motion-filled blur. Reflections of light each have their own shifting halo. Nights are especially beautiful: on top of the precise points of light, the sparkle of the weaker eye floats, broader and more diffuse, like a glow from within each object. Or perhaps this glow is underneath the clear image, not on top. It is hard, even impossible, to work out which is the foreground. This double vision is more irritating by day: the brightness of both images becomes more of a fight within my brain, whilst the calmer hues of night allow a complementary co-existence. The real world is bathed in the warm glow of the weaker eye.
But the benefits of seeing double extend beyond the aesthetic. How you see alters how you think. Studies have even shown that aspects of personality can be predicted from the movement of the eyes. Changes in ways of seeing have been linked to great artistic ability: it has been speculated from their self-portraits that Rembrandt, Picasso and even da Vinci had lazy eyes, and a corresponding flatness of worldview. This forced adaptation and compromise, the development of alternative ways of seeing and depicting the world.
But double vision did not help me become a brilliant painter. In fact, I’m terrible at drawing: trying to visualize things on the page is very difficult, as the two eyes constantly jostle, meaning there is no fixed point of focus to imagine from. This outweighs any artistic advantages I could gain from seeing a flat world.
Instead of leading me to art, it led me to thinking about thought. Seeing the two images competing, completing, complementing one another is to watch my own thought in real time. This is true for you also: the world you see is the inside of your head. When you open your eyes and gaze upon a sunset, you are actually watching your brain at work.
Your brain performs this illusion well, but in me it is far from perfect. When my attention shifts between the two eyes, the images grow and shrink in relative vividness, even in relative realness. I can see attention at work. As a child I used to sit and play with the two images, often in a futile attempt to achieve my dream of using two eyes as one.
That dream has not yet come true, and I increasingly wonder whether it is such a dream at all. When I close my right eye, seeing with the good eye only, the world becomes still, static, even dull. It is like acquiring a newfound yet bland peace, like an ADHD sufferer suddenly finding calm and realizing they have nothing to do. My imperfect duet of the two eyes, by contrast, gives the world a living, breathing air, each glance birthing a new flurry of thoughts buzzing in this abstract, unfocussed theatre inside my head, the theatre that sees twice. Viewing can never be a quiet act for me. How much of my personality comes from this eccentric worldview I cannot tell, but I am inclined to be optimistic.
Whatever the answer, fate has given me at least one great gift. Through all of these moments, from long hours in hospitals taking tests and years wearing an eye patch to coax my brain into seeing, to experiments in and frustrations with seeing double, a fascination with what I am and how I came to be was born, alongside the tiny speck of tissue that blocked the sight of my right eye. I became restlessly curious to understand this problem deep inside myself, a problem of my self and its partial doubling, and was drawn to reading and learning all that I could about brains. I went on to study neuroscience as an undergraduate, and then as a PhD student.
That is what this book was originally about: neuroscience and the brain, the instrument that I thought makes the mind. But since then it has expanded, as my own life experience confronted me with my own embodiment. I will not give away the narrative of the book, but this is me in April 2017, in an intensive care bed following surgery on my aorta:
The author in an intensive care bed, April 22nd 2017.
This, and two other experiences around then, fundamentally changed my sense of self, my thoughts as a scientist, and ultimately this book, which has become as much a memoir of a scientist’s life as a book about the brain: my hopes, frustrations, thoughts, experiences, discoveries, friendships, training, successes and failures. The core parts of this book were written, at least in my head, during the build-up to and recovery from this operation. We shall come to this question of embodiment and the mind-body link.
Until then, to the fabrics of thought we go.
Note: this was written originally in March 2020 for a different blog.
You are probably just beginning a long period of isolation. I spent 2017 largely alone in my family home preparing for and recovering from a surgery to fix a serious illness, which ultimately took years to recover from fully. I thought the lessons I learnt might be useful to others now.
What I found is that ‘free time’ vanishes like a pile of sand slipping through open, welcoming hands. You only realise it is gone when a few grains are left. Twitter consumes.
Da Vinci’s words apply well: “Art breathes from constraint and suffocates from freedom.” We may have dreams of how we will use our newfound isolated freedom but, if you are like me, your will will be captured by clickbait and the time will vanish without conscious countermeasures.
I suggest four simple habits and goals for the bunker:
- Create a virtual bunker: work in long periods of deliberate isolation, not scattered, interruptible time (deep work).
- Create targets at different timescales, to provide milestones to move toward (distant stars).
- Create a clear work/personal divide and ‘mindset changes’ via routines (organising the paintbrushes).
- Develop a non-work skill such as meditation to provide balance, an escape from work, and fulfillment.
Create a virtual bunker: deep work
‘Evil is whatever distracts’
In isolation, the divide between ‘personal’ and ‘work’ becomes unclear, and so we shift between the two modes both incompletely and briefly. We must instead structure our bunker time for maximum quality of effort.
The phrase ‘deep work’ comes from Cal Newport’s book of the same name. Deep work is a sustained period of isolated concentration, free of distracting influences. Newport argues that we have lost the ability to do this, instead spending our time consumed by the ‘shallows’ of email and the like, and that our productivity has quietly collapsed because of it.
Newport tells of famous (and non-famous) achievements made by deliberately cutting oneself off from the world for a specified period: no internet, no email, no phone calls, no social contact. Carl Jung worked in a tower for almost the entire day, cut off from distraction. Mark Twain had an isolated hut for writing, and would be summoned for dinner by a horn. Distraction today is far worse than it was then.
The essence of deep work is that an hour of continuous, undistracted work is worth 3–4 hours of distracted, fragmented time, or eight 15-minute blocks of undistracted work. It gets you ‘in the zone’ and keeps your mind on task.
Crucial to deep work are two key things:
1 – Set aside deliberate blocks of time, say 2 hours, to work on something.
2 – Cut off all contact with the outside world during this time.
This may sound simple (it is), but when did you last do it? Structure your day around 1–4 blocks of deep work, depending upon their length (when I met Newport he suggested that sustaining intense deep work for more than 4 hours per day is challenging, which matches G. H. Hardy’s advice in ‘A Mathematician’s Apology’). When distraction comes, say you are in ‘deep work’ – a polite f-off to ‘whatever distracts’.
In my experience, this is far more important and useful than expensive productivity apps. It also acts as a ‘free organisational tool’, forcing you to structure time and prioritise by picking your ‘deep work targets’ (see below).
To get a better sense of this, see this video, an excellent summary of deep work by a YouTube blogger who has interviewed Newport.
Something I have found very useful for deep work is to play ‘coffee shop sounds’ – see this example (on Spotify/Apple Music).
Two useful apps: Freedom and SelfControl. Freedom shuts off the internet; SelfControl blocks specific sites. They will educate you about how quickly you reach for the Twitter dopamine hit.
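If you would rather not install anything, the same idea can be sketched with the hosts file on macOS or Linux. This is a rough sketch, not a polished tool: the site names are purely illustrative, and it assumes you have admin (sudo) rights. It generates lines that redirect distracting domains to localhost; appending them to /etc/hosts blocks the sites, and deleting them restores access.

```shell
# Sketch: a free alternative to blocker apps (macOS/Linux), assuming
# sudo access. Generates /etc/hosts entries that point distracting
# sites at localhost. Site names here are illustrative only.
BLOCKLIST="twitter.com www.twitter.com news.ycombinator.com"

# Print one hosts-file entry per blocked site.
for site in $BLOCKLIST; do
  echo "127.0.0.1 $site"
done
# To apply for a deep work block, append this output to /etc/hosts
# (e.g. via 'sudo tee -a /etc/hosts'), and delete those lines again
# when the block ends.
```

The clumsiness of undoing it by hand is almost a feature: it adds just enough friction to stop the reflexive dopamine check.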
Note added 18th April 2021 – ‘E-mails “hurt IQ more than pot”’ – CNN
Create stars of varying distance: ‘compass-like’ targets at different timescales
“He whose gaze is fixed on a distant star will not falter.” – Leonardo da Vinci
It’s easy to lose sight of the distant star indoors, especially with the infinite labyrinth of the internet at our single-click-away disposal. Deep work without direction will lead to piles of semi-finished things. Time becomes amorphous without targets at different timescales.
This sensation only gets worse without active countermeasures. What I found in my bunker year was that without clear weekly targets, weeks blurred into one another, and nothing ever got ‘done’ – books were left half read, essays begun and half forgotten. I did many things, but never finished them: I was a slave to momentary wills. Weekly targets give you clear ‘weekends’.
Targets create a sense of timescale, a sense of time as a resource to be used, and they act at different timescales. I was fortunate enough to be educated in the Oxford tutorial system. A ‘hidden routine’ in this system is the weekly essay, which gives you a target and a focus. The weekly essay gave you your ‘local purpose’, the stepping stone to the exams and, more importantly, to being a scientist.
Without this weekly essay as a PhD student, I found myself wandering. I lacked the productivity of my undergraduate days, spending much time dithering. I only realised what I had been missing in the final year of my thesis: clear weekly targets, very few in number. Setting these is difficult to do as cleanly as when a mentor sets them, but it is a skill I have since consciously worked on. In essence, this is about being mindful of the goals you have and rendering them explicit, rather than simply aiming for vague notions of productivity.
The Oxford term system also provides a mental scaffolding of time – I can still remember what I was doing each term at Oxford, and I used to be able to remember individual weeks (‘3rd week’), because they were placed within the structure of three terms with holidays. Each term is 8 weeks, separated by long vacations, meaning you have a total of 6 divisions to the year. It is likely we will be bunkered for a year on-and-off: divide that time up. I follow the Oxford system now using an Oxford diary.
Likewise, without clear daily targets, it is easy to find yourself at lunch having done nothing but ‘deep work’ on whatever came up in the morning’s emails. Setting targets is a simple way of prioritising. For this, Zen Habits’ MITs (Most Important Tasks) are very useful: identify 1–3 of the most important tasks for the day and structure your time around them. Note that a long conversation with someone is often an MIT – today I tried to sneak a call in and ended up with so much to follow up on that I had to schedule a deep work session for it.
Your work time therefore becomes a hierarchy of targets.
Pick daily, weekly and termly targets, and remind yourself of them daily. I put the weekly targets in my diary so I see them everyday. Think in the future – where do I need to be?
A nice phrase is ‘a year is shorter than you think; ten years is longer than you think’ – I read it on Twitter some time ago and cannot find the source, but it is not original to me.
The painter tidies his paintbrushes: habits, routines, and the structuring of time
States of mind are something we find ourselves at the mercy of in isolation, as it is easy to get ‘stuck’ in a mindset without external cues to nudge us out of it, or even to point out that we are in a maladaptive one. We assume what the Zen master Takuan Soho (1573-1645) calls right-mindedness: that our mind is free to deal with what it needs to deal with, rather than stopping in distraction. Yet our state of mind depends greatly on our surroundings, and we usually only realise we have gone astray some time after it happens.
Monasteries use chanting, rituals and daily schedules to keep the monastic mind focussed upon its higher calling. They also use the physical environment, a topic for a later blog. Our work environment, for better or worse, sculpts the mind too. You may find, as I did, that the sudden removal of the crutches and stimuli of the working mind reveals the very existence of those crutches and stimuli. Freedom reveals the benefits of constraints. The trip to work in the morning tunes the mind for what is to come. Seeing the boss reminds you of the importance of deadlines. Travelling home at night helps you ‘switch off’ – an effect somewhat reduced by the curse of chronic connectivity.
In the Eastern arts, the frailty of the will is recognised, and the will deliberately trained. In Zen in the Art of Archery, Herrigel tells of a master of painting preparing for and executing a painting in front of his pupils:
“A painter seats himself before his pupils. He examines his brush and slowly makes it ready for use, carefully rubs ink, straightens the long strip of paper that lies before him on the mat, and finally, after lapsing for a while into profound concentration, in which he sits like one inviolable, he produces with rapid, absolutely sure strokes a picture which, capable of no further correction and needing none, serves the class as a model. A flower master begins the lesson by cautiously untying the bast which holds together the flowers and sprays of blossom, and laying it to one side carefully rolled up.
… But why doesn’t the teacher allow these preliminaries, unavoidable though they are, to be done by an experienced pupil? Does it lend wings to his visionary and plastic powers if he rubs the ink himself, if he unties the bast so elaborately instead of cutting it and carelessly throwing it away? And what impels him to repeat this process at every single lesson, and, with the same remorseless insistence, to make his pupils copy it without the least alteration? He sticks to this traditional custom because he knows from experience that the preparations for working put him simultaneously in the right frame of mind for creating. The meditative repose in which he performs them gives him that vital loosening and equability of all his powers, that collectedness and presence of mind, without which no right work can be done.”
I can try to distil this into its two simplest parts: first, have strict work hours and non-work hours. Second, when it is time to work, have a consistent place for it, and a routine to start and end the work. For me, that means shutting everything else down, removing all distractions from the task at hand, and setting a timer. Then I think for several minutes about what is to come, which acts as a ‘work meditation’ akin to the painter’s preparations that Herrigel describes.
I also recommend spending 25 minutes in the morning on a planning/tidying session. For example, today I took out my small notebook, reviewed its projects page, and identified three priorities. I then simply thought about those priorities, visualising how I would do them, what I would need, and so on. When it came to doing them, my mind was ‘ready’.
Deliberate rest is an important concept, relating to the divide between personal time and work. Scrolling Instagram on an iPhone is not rest; it is stimulating, reward-seeking behaviour. I strongly recommend setting strict ‘work hours’ and ‘personal hours’. Experiment with the benefits of cutting yourself off in the evenings, deep-work style.
Develop a skill: meditation, drawing etc
The single most important thing I did in my year of isolation, apart from having a major operation and eating lots of food to recover, was to begin meditating. The meditation I focussed upon is Hakuin’s Nanso meditation, on which much of my book is centred, as well as more standard ‘clear the mind’ meditation.
An exceptional piece of writing on meditation comes from Dogen, effectively the father of the Soto school of Zen: ‘Recommending Zazen to All People’. I say more about this in the long-form version.
This is an article published in the Telegraph in June 2018 by my brother and me, concerning UK national strategy in science and innovation. We called on the UK to ‘lead the future by creating it’. Below it is a comment from computing pioneer Alan Kay endorsing it as ‘good advice’; his phrase ‘create the future’ was an inspiration.
Science holds the key, June 7th 2018, Telegraph, James W. Phillips & Matthew G. Phillips
The 2008 crisis should have led us to reshape how our economy works. But a decade on, what has really changed? The public knows that the same attitude that got us into the previous economic crisis will not bring us long-term prosperity, yet there is little vision from our leaders of what the future should look like. Our politicians are sleeping, yet have no dreams. To solve this, we must change emphasis from creating “growth” to creating the future: the former is an inevitable product of the latter.
Britain used to create the future, and we must return to this role by turning to scientists and engineers. Science defined the last century by creating new industries. It will define this century too: robotics, clean energy, artificial intelligence, cures for disease and other unexpected advances lie in wait. The country that gives birth to these industries will lead the world, and yet we seem incapable of action.
So how can we create new industries quickly? A clue lies in a small number of institutes that produced a strikingly large number of key advances. Bell Labs produced much of the technology underlying computing. The Palo Alto Research Centre did the same for the internet. There are simple rules of thumb about how great science arises, embodied in such institutes. They provided ambitious long-term funding to scientists, avoided unnecessary bureaucracy and chased high-risk, high-reward projects.
Today, scientists spend much of their time completing paperwork. A culture of endless accountability has arisen out of a fear of misspending a single pound. We’ve seen examples of routine purchases of LEDs that cost under £10 having to go through a nine-step bureaucratic review process.
Scientists on the cusp of great breakthroughs can be slowed by years mired in review boards and waiting on a decision from on high. Their discoveries are thus made, and capitalised on, elsewhere. We waste money, miss patents, lose cures and drive talented scientists away to high-paid jobs. You don’t cure cancer with paperwork. Rather than invigilate every single decision, we should do spot checks retrospectively, as is done with tax returns.
A similar risk aversion is present in the science funding process. Many scientists are forced to specify years in advance what they intend to do, and spend their time continually applying for very short, small grants. However, it is the unexpected, the failures and the accidental that are the inevitable cost, and the source of fruit, in the scientific pursuit. It takes time, it takes long-term thinking, it takes flexibility. Peter Higgs, the Nobel laureate who predicted the Higgs boson, says he wouldn’t stand a chance of being funded today for lack of a track record. This leads scientists collectively to pursue incremental, low-risk, low-payoff work.
The current funding system is also top-down, prescriptive and homogenous, administered centrally from London. It is slow to respond to change and cut off from the real world.
We should return to funding university departments more directly, allowing more rapid, situation-aware decision-making of the kind present in start-ups, and create a diversity of funding systems. This is how the best research facilities in history operated, yet we do not learn their key lesson: that science cannot be managed by central edict, but flourishes through independent inquiry.
While Britain built much of modern science, today it neglects it, lagging behind other comparable nations in funding, and instead prioritising a financial industry prone to blowing up. Consider that we spent more money bailing out the banks in a single year than we have on science in the entirety of history.
We scarcely pause to consider the difference in return on investment. Rather than prop up old industries, we should invest in world-leading research institutes with a specific emphasis on high-risk, high-payoff research.
Those who say this is not government’s role fail the test of history. Much great science has come from government investment in times of crisis. Without Nasa, there would be no SpaceX. These government investments were used to provide a long-term, transformative vision on a scale that cannot be achieved through private investment alone – especially where there is a high risk of failure but high reward in success. The payoff of previous investments was enormous, so why not replicate the defence funding agencies that led to them with peacetime civilian equivalents?
In order to be the nation where new discoveries are made, we must take decisive steps to make the UK a magnet for talented young scientists.
However, a recent report on ensuring a successful UK research endeavour scarcely mentioned young scientists at all. An increased focus on this goal, alongside simple steps like long-term funding and guaranteed work visas for their spouses, would go a long way. In short, we should be to scientific innovation what we are to finance: a highly connected nerve centre for the global economy.
The political candidate who can leverage a pro-science platform to combine economic stimulus with economic pragmatism will transform the UK. We should lead the future by creating it.
Comment from Alan Kay:
Good advice! However, I’m afraid that currently in the US there is nothing like the fabled Bell Labs or ARPA-PARC funding, at least in computing where I’m most aware of what is and is not happening (I’m the “Alan Kay” of the famous quote).
It is possible that things were still better a few years ago in the US than in the UK (I live in London half the year and in Los Angeles the other half). But I have some reasons to doubt. Since the new “president”, the US does not even have a science advisor, nor is there any sign of desire for one.
A visit to the classic Bell Labs of its heyday would reveal many things. One of the simplest was a sign posted randomly around: “Either do something very useful, or very beautiful”. Funders today won’t fund the second at all, and are afraid to fund at the risk level needed for the first.
It is difficult to sum up ARPA-PARC, but one interesting perspective on this kind of funding was that it was both long range and stratospherically visionary, and part of the vision was that good results included “better problems” (i.e. “problem finding” was highly valued and funded well) and good results included “good people” (i.e. long range funding should also create the next generations of researchers). In fact, virtually all of the researchers at Xerox PARC had their degrees funded by ARPA; they were “research results” who were able to get better research results.
Since the “D” was put on ARPA in the early 70s, it was then not able to do what it did in the 60s. NSF in the US never did this kind of funding. I spent quite a lot of time on some of the NSF Advisory Boards and it was pretty much impossible to bridge the gap between what was actually needed and the difficulties the Foundation has with congressional oversight (and some of the stipulations of their mission).
Bob Noyce (one of the founders of Intel) used to say “Wealth is created by Scientists, Engineers and Artists, everyone else just moves it around”.
Einstein said “We cannot solve important problems of the world using the same level of thinking we used to create them”.
A nice phrase by Vi Hart is “We must ensure human wisdom exceeds human power”.
To make it to the 22nd century at all, and especially in better shape than we are now, we need to heed all three of these sayings, and support them as the civilization we are sometimes trying to become. It’s the only context in which “The best way to predict the future is to invent it” makes any useful sense.