
Unfocused, unimaginative, depressed

Our life on this planet has improved remarkably over the past century. Today, on average, we are healthier, wealthier and less violent. We live longer too. Despite these unprecedented changes, there are many indications that we are in a crisis we have not yet fully recognized, even though it lurks just beneath the surface of our everyday conversations and news feeds. I mean a crisis that threatens our future: a crisis of the mind. A cognitive crisis. As such, it touches the core of what makes us human: the dynamic interplay between the brain and the environment - the constant cycle in which we perceive our surroundings, integrate information and act on it.

In brief

The ubiquity of information technology today has a massive negative impact on our cognition.

Depression, anxiety and attention disorders in children have risen sharply, while creative thinking and empathic interest have declined.

In the future, however, we could use technology and AI in ways that support our brain functions and improve our cognition.

Hundreds of millions of people around the world today seek medical help for severe cognitive impairments such as depression, anxiety, schizophrenia, autism, post-traumatic stress disorder, obsessive-compulsive disorder, ADHD, or dementia. In the United States alone, 16.2 million adults suffer from depression, 18.7 million from anxiety disorders and another 5.7 million from dementia - a number that is expected to triple in the coming decades. And despite significant investments in research and treatment, the number of people affected keeps growing: cases of depression and anxiety rose by 18.4 and 14.9 percent, respectively, between 2005 and 2015, while the number of dementia patients increased by 93 percent.

To some extent, these trends reflect the growth and aging of the world's population. While living longer has obvious benefits, one of its negative consequences is the impairment of many facets of our cognition. However, there are increasing signs that something else is wrong - especially when it comes to our children's emotional self-regulation and ability to concentrate: depressive illnesses among American adolescents have increased by 33 percent within a few years, and considerably more teenagers died by suicide than in 2010.

«Ultimately, we should face the realization that our brains simply have not kept pace with the rapid change in our living environment.»

Attention disorders have also increased dramatically. While a growing awareness of symptoms and triggers - and thus more frequent diagnoses - may play a role, the extent of this escalation points to a deeper problem. It is clearly a global crisis: more than half a billion people worldwide suffer from cognitive impairments of this kind, linked to trillions of dollars in treatment costs and lost productivity. And even where subclinical deficits do not lead to a diagnosis, they pose a real risk to attention, emotional regulation and memory. Creative thinking and empathic interest are declining in children and adolescents; even the so-called Flynn effect, the worldwide rise in IQ over the last century, shows signs of stagnation - sometimes even reversal - in industrialized countries.

There are many individual factors that negatively influence our cognition, but ultimately we should face the realization that our brain has simply not kept pace with the rapid changes in our living environment. This applies in particular to the ubiquity of information technology. We humans are essentially information-hungry beings - a profound change in the flow of information inevitably has a major impact on us. And as we can see today, many of those effects are negative.

One mind, many facets

Although the various aspects of our cognition - memory, attention, perception, the regulation of emotions - differ on the surface, we now know that cognitive impairments are symptoms of a common underlying problem: dysfunctions of the prefrontal cortex have been linked to the symptoms of almost every neuropsychiatric disorder. Neuroscientists and medical professionals have also shown that the various facets of our cognition are much more closely related than previously thought. Attention deficit, for example, is now listed in diagnostic manuals as one of the key characteristics of depression. The reality is: each of us has a single, indivisible mind, and if we want to nourish and nurture it, we must finally acknowledge that.
As already mentioned, there is an aggravating factor that unfolds its effect across all aspects of our cognition: the drastic transition into the information age that we have undergone in the course of the digital revolution. Technology has radically changed the way we interact with our surroundings, with each other, and with ourselves. The environment in which our cognition once evolved has disappeared; the new one, in which information reaches us through countless channels and at ever greater speed, challenges our brain and our behavior at a fundamental level. This can be seen in the laboratory, where the effects of stimulus and information overload are documented. And it shows up in the real world as well, where we find a strong correlation between increased use of technology and the rise in depression, anxiety, suicides and attention deficits - especially in children.

"There is cause for concern, but it is not all disaster."

Although the exact mechanisms are still being researched, a complex picture is emerging: shortened reward cycles, diminished tolerance for delayed gratification, and attention deficits can all be observed. Information overload has been linked to stress, depression and anxiety; multitasking has been linked to safety risks (such as texting while driving) and to a lack of focus that harms relationships, education and working life. In addition, the constant preoccupation with technology leaves us less time for things that matter to our mental health, such as time in nature, sufficient exercise, personal contact and restful sleep. The consequences for empathy, cooperation and interpersonal relationships are only beginning to be understood.

So there is cause for concern, but it is not all disaster. The information age has given us countless opportunities to expand our awareness and connect like never before. Fortunately, the negative effects of information technology are increasingly being recognized, both by the entrepreneurs who created it and by its avid users. Novel approaches try to help us develop healthier habits in handling software and devices, so that we control these technologies and not the other way around. We should think about consuming information much as we think about consuming food. Behavior change alone, however, will not be enough. And the stakes will rise even further as we immerse ourselves in virtual or augmented realities and let our interactions be guided by artificial intelligences.

So we need to develop devices and applications that are shaped by a deeper understanding of how our brains work - and of their limits and weaknesses - because we cannot put the genie of this technology back in the bottle.
So what should we do? We must enhance cognition itself!

The cognitive challenge

We need better minds to cope with the flood of information that reaches us via the internet, social media and smartphones - and with the technologies we are sure to invent next. In order to thrive in this new environment, we need to raise the maturity of our collective consciousness. This requires coordinated efforts, from political actors and national health institutes up to the UN and the powerful in Davos. Indeed, managing the cognitive crisis should be viewed as a challenge on par with other global priorities, such as the eradication of infectious diseases and access to clean water. And in order to cope with these challenges, we must have the necessary mental capacities. So the idea of cognitive self-improvement should not seem strange. We humans have long been obsessed with biological self-optimization. As for the body, strength, endurance, speed, balance, flexibility and coordination have been purposefully improved with the help of specialized technologies and programs provided by trained practitioners. The same is fatally lacking when it comes to optimizing our cognition, and the price of this neglect can hardly be overestimated. Promoting cognition in “normally” functioning brains should be a core task of our educational institutions, and remedying its malfunctions a main goal of our medical ones. Neither of the established providers is currently doing that. From teachers and therapists to psychiatrists and neurologists, our cognitive practitioners simply aren't equipped with the tools or training they would need to support our brains.

Where are we today?

Five specific shortcomings in our education and health systems currently prevent us from coping with the cognitive crisis. First, our tools for testing cognitive skills and deficits are outdated and inadequate. Second, we still lack targeted treatments for attention and memory disorders, depression and anxiety. Third, there is a lack of personalized treatments tailored to the individual patient. Fourth, medical and educational approaches are often isolated and independent of one another. And finally, we use open-loop systems - that is, systems without the quantitative real-time feedback through which treatments could be adjusted dynamically. This is the daunting reality for all patients suffering from neurological and psychiatric illnesses. Advances in functional imaging, which have made our brain and its cognitive functions more understandable in the laboratory, have so far not been used therapeutically to shed light on individual cases.

«Our test tools for cognitive skills and deficits are outdated and inadequate.»

When a clinical provider detects cognitive deficits in a patient, they prescribe drugs to treat the symptoms. Unfortunately, the drugs in today's medical toolbox are blunt instruments: they influence neurotransmitter systems rather than the specific neuronal networks underlying the problem. For some diseases these drugs can be life-saving - but because they do not reach specific networks in the brain, nor will they change the underlying pathology of a disease anytime soon, these treatments remain very imprecise and therefore have side effects. There is nothing conceptually wrong with the idea of using a single molecule to improve our cognition, but pharmaceutical treatments have not made significant advances in this area for decades.
Imprecise assessment tools coupled with barely targeted, open-loop treatments now dominate the world of mental illness, across the full range of disorders: major depression, post-traumatic stress disorder, anxiety disorders, ADHD, autism, dyslexia, traumatic brain injury, Alzheimer's disease, Parkinson's disease - all of them. The scenario becomes even more troubling for children with cognitive challenges, who face the same limitations. In our school system, children take exams that assess how well they can reproduce recently absorbed information. How successfully they can control their attention and emotions, however, is rarely considered until a learning disability is suspected.

What does the future hold?

This is the perfect opportunity to let the same technologies that helped create the cognitive crisis improve exactly what makes us human. Mobile devices are now being developed that carry a wide range of sophisticated sensors: touchscreens, accelerometers, GPS, voice recognition, heart rate trackers, detectors for facial expressions, eye movements or brain activity (e.g. EEG). They can be used to collect and evaluate data about us, both passively and actively. And this technology is ideally positioned to serve as the foundation for the next generation of cognition tests - tests that will enable us to better understand ourselves in the real world and in real time.
Recordings of this kind could give us a much more differentiated view of our abilities - for example, of which facets of our cognition are stable traits and which are physiological states that change dynamically with our environment.
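To make the trait-versus-state distinction concrete, here is a minimal sketch; all numbers and names are hypothetical, not taken from any real product. Pooled sensor samples yield a stable baseline (the trait), while each sample's deviation from that baseline reflects the momentary state:

```python
# Illustrative sketch (hypothetical data) of the trait-vs-state
# distinction: pooled sensor samples yield a stable baseline (trait),
# while each sample's deviation from it reflects the momentary state.

from statistics import mean

# simulated reaction times (ms) from a phone-based attention probe
samples = [412, 398, 430, 405, 520, 415, 401, 390]

baseline = mean(samples)                      # trait: the long-run average
deviations = [s - baseline for s in samples]  # state: how we do right now

print(f"trait (baseline reaction time): {baseline:.0f} ms")
for i, dev in enumerate(deviations):
    note = "  <- unusually slow this moment" if dev > 50 else ""
    print(f"sample {i}: {dev:+.0f} ms{note}")
```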

This approach must be pursued with great ethical care, in order to protect sensitive data and to understand and prevent misuse. Knowing ourselves better will also mean the inevitable effort of overcoming ingrained prejudices about our cognition. Some will have concerns about tracking attention, memory and decision-making that they do not have about cholesterol, glucose or blood pressure levels. This has a lot to do with the stigmatization of mental illness, which is seen as reflecting the “quality” of a personality rather than a purely medical condition. There seems to be a natural tendency to treat cognitive aspects of our biological functioning, more than others, as a mirror of "who" we are. We may say of someone that they are inattentive or that they have high blood pressure - the former is taken to define the person and is often tied to moral judgment, while the latter simply happens to them and is considered a plain biological fact. These prejudices need to go.

When we understand our cognition in detail, the next goal is to improve it. Before we turn to the technologies that can help us do this, however, it is important to understand what we can already achieve through everyday activities. Extensive studies have shown that sport, intellectual challenges, interpersonal contact, sleep, nutrition, making music, dancing and time in nature can do a great deal of good. And some of the oldest formalized practices people follow are, at their core, cognition-enhancing exercises based on mindfulness and contemplation: the positive effects of meditation on our mood and attention, on compassion and stress management, are medically well documented. For far too long, wellness and medicine have been treated as separate disciplines, with the health system essentially being a disease system. As science works out the benefits of cognitive-enhancement approaches, we will finally break down the barriers that block advances in preventive treatment.

The challenge now is to find out how, with the help of technology, we can create experiences that make maximum use of the plasticity of our brain, improve our perception, refine our behavior and ultimately sharpen our minds. Of course, not all experiences are created equal. The most effective way to achieve this ambitious goal is through closed-loop systems. These are already common in our physical devices: even household appliances like thermostats or tumble dryers use them, taking temperature and humidity measurements to determine how much heat still needs to be added. In biological applications, however, they are almost non-existent - and, as noted above, both education and healthcare today rely on open-loop systems.
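The thermostat makes the principle tangible: the system's output is measured and fed back to decide the next input. A minimal sketch of such a loop, with purely illustrative names and thresholds, might look like this:

```python
# A minimal sketch of a closed control loop, using the thermostat
# example from the text. All names and thresholds are illustrative.

def thermostat_step(measured_temp, target_temp, heater_on, hysteresis=0.5):
    """Decide the heater's next state from the latest measurement.
    The loop is 'closed' because the system's output (room temperature)
    feeds back into the decision about the next input (heat)."""
    if measured_temp < target_temp - hysteresis:
        return True           # too cold: turn the heat on
    if measured_temp > target_temp + hysteresis:
        return False          # warm enough: turn the heat off
    return heater_on          # inside the band: keep the current state

# sense -> decide -> act, repeated over simulated sensor readings
heater_on = False
for reading in [19.2, 19.6, 20.4, 21.1]:
    heater_on = thermostat_step(reading, target_temp=20.5, heater_on=heater_on)
    print(f"{reading:.1f} °C -> heater {'on' if heater_on else 'off'}")
```

An open-loop system, by contrast, would run the heater on a fixed schedule regardless of the room's actual temperature - which is exactly the criticism the text levels at today's education and healthcare.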

Technological approaches with closed-loop control make it possible to design experiences that activate very specific neural networks and exert constant pressure on them via interactive challenges. This increases the plasticity of the brain and optimizes selected functions over time. Sounds abstract? It is very concrete: imagine you are playing a video game in which sensors collect information about you - performance metrics, emotional reactions, body movements and brain activity. These data are used in real time to adapt the game environment to you and to personalize challenges and rewards in such a way that your cognition improves. That would be like a one-on-one session with the ultimate personal cognitive trainer! Many laboratories and companies around the world are actively pursuing this vision, including my own efforts in technology incubation and research. Non-invasive, affordable, safe and accessible technology - such as smartphones, tablets, wearable physiological devices, motion detectors and interactive media - can be used to understand our minds and help us overcome the cognitive crisis.
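As a rough illustration of what such a game loop could look like - this is a generic "staircase" scheme borrowed from psychophysics, not the actual algorithm of any real product - each trial's measured performance feeds back into the difficulty of the next challenge, holding the player near a fixed success rate:

```python
# Hypothetical sketch of an adaptive game loop: success makes the next
# challenge slightly harder, failure makes it easier, so play hovers
# near a target success rate. A generic psychophysics 'staircase',
# not the algorithm of any actual product.

import random

STEP_UP, STEP_DOWN = 0.10, 0.15   # asymmetric steps converge on ~60% success

def play_trial(difficulty):
    """Stand-in for one challenge plus its sensor readout. In a real
    system this would combine performance metrics, emotional reactions,
    body movements and brain activity; here, success simply becomes
    less likely as difficulty rises."""
    return random.random() < 1.0 / (1.0 + difficulty)

difficulty = 1.0                  # arbitrary units, e.g. target speed
for trial in range(20):
    success = play_trial(difficulty)                   # sense the player
    if success:
        difficulty += STEP_UP                          # act: make it harder
    else:
        difficulty = max(0.1, difficulty - STEP_DOWN)  # act: make it easier
    print(f"trial {trial:2d}: {'hit ' if success else 'miss'}"
          f" -> difficulty {difficulty:.2f}")
```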

«What better use could there be for AI than improving HI - human intelligence?»

Now go one step further and imagine the role that innovations in artificial intelligence and virtual reality could play - for example, multisensory virtual environments in which your whole body's interactions are guided by an artificial intelligence that understands you, in that moment, better than any human could - including yourself. The AI would create an optimal closed-loop experience, designed to lastingly improve all aspects of your cognition and keep them at a high level throughout your life. It would induce subtle changes in mood, aggression, attention and memory by increasing the natural plasticity of the brain - not to control you, but on the contrary to put control of your own mind into your hands and to prevent (or at least delay) a slide into major depression, anxiety, ADHD or dementia.
What better use could there be for AI than enhancing HI - human intelligence? If we are creative and thoughtful, we will deliver on the ultimate promise of technology: to create an environment that stimulates the next phase in the evolution of the human mind.

Medical advances over the past hundred years have improved our health far beyond anything previously achieved, and technology was an important part of that success. For our species to continue to thrive in this increasingly complex world, we must look inward and examine, carefully and honestly, the rifts we find there. Crises are moments when important decisions must be made to avert future disasters. That moment has come for our brains and minds: human cognition is in trouble - and it is deteriorating, especially for our children. For far too long we have maintained the illusion of being separate from our environment. Now is the time to reflect on what it means to be human.

This article was first published on the “Medium” online platform. Here it appears for the first time in German.