This Is What Happens to the Brain When You Give Up Sugar for Lent

Summary: Decided to cut back on sugary treats? A new article considers why it might not be such a sweet idea for all.

Source: The Conversation.

Anyone who knows me also knows that I have a huge sweet tooth. I always have. My friend and fellow graduate student Andrew is equally afflicted, and living in Hershey, Pennsylvania – the “Chocolate Capital of the World” – doesn’t help either of us.

But Andrew is braver than I am. Last year, he gave up sweets for Lent. I can’t say that I’m following in his footsteps this year, but if you are abstaining from sweets for Lent this year, here’s what you can expect over the next 40 days.

Sugar: natural reward, unnatural fix

In neuroscience, food is something we call a “natural reward.” In order for us to survive as a species, things like eating, having sex and nurturing others must be pleasurable to the brain so that these behaviours are reinforced and repeated.

Evolution has resulted in the mesolimbic pathway, a brain system that deciphers these natural rewards for us. When we do something pleasurable, a bundle of neurons called the ventral tegmental area uses the neurotransmitter dopamine to signal to a part of the brain called the nucleus accumbens. The connection between the nucleus accumbens and our prefrontal cortex dictates our motor movement, such as deciding whether or not to take another bite of that delicious chocolate cake. The prefrontal cortex also activates hormones that tell our body: “Hey, this cake is really good. And I’m going to remember that for the future.”

Not all foods are equally rewarding, of course. Most of us prefer sweets over sour and bitter foods because, evolutionarily, our mesolimbic pathway reinforces that sweet things provide a healthy source of carbohydrates for our bodies. When our ancestors went scavenging for berries, for example, sour meant “not yet ripe,” while bitter meant “alert – poison!”

Fruit is one thing, but modern diets have taken on a life of their own. A decade ago, it was estimated that the average American consumed 22 teaspoons of added sugar per day, amounting to an extra 350 calories; it may well have risen since then. A few months ago, one expert suggested that the average Briton consumes 238 teaspoons of sugar each week.
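As a rough sanity check on that daily figure (assuming the common approximations of about 4 grams of sugar per teaspoon and 4 calories per gram): 22 teaspoons × 4 g × 4 calories ≈ 350 calories.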

Today, with convenience more important than ever in our food selections, it’s almost impossible to come across processed and prepared foods that don’t have added sugars for flavour, preservation, or both.

These added sugars are sneaky – and unbeknown to many of us, we’ve become hooked. Just as drugs of abuse – such as nicotine, cocaine and heroin – hijack the brain’s reward pathway and make users dependent, mounting neuro-chemical and behavioural evidence suggests that sugar is addictive in the same way, too.

Sugar addiction is real

“The first few days are a little rough,” Andrew told me about his sugar-free adventure last year. “It almost feels like you’re detoxing from drugs. I found myself eating a lot of carbs to compensate for the lack of sugar.”

There are four major components of addiction: bingeing, withdrawal, craving, and cross-sensitisation (the notion that one addictive substance predisposes someone to becoming addicted to another). All of these components have been observed in animal models of addiction – for sugar, as well as drugs of abuse.

A typical experiment goes like this: rats are deprived of food for 12 hours each day, then given 12 hours of access to a sugary solution and regular chow. After a month of following this daily pattern, rats display behaviours similar to those of rats on drugs of abuse. They’ll binge on the sugar solution in a short period of time, consuming much more of it than their regular food. They also show signs of anxiety and depression during the food deprivation period. And many sugar-treated rats that are later exposed to drugs such as cocaine and opiates are more likely to demonstrate dependent behaviours towards those drugs than rats that did not consume sugar beforehand.

Like drugs, sugar spikes dopamine release in the nucleus accumbens. Over the long term, regular sugar consumption actually changes the gene expression and availability of dopamine receptors in both the midbrain and frontal cortex. Specifically, sugar increases the concentration of a type of excitatory receptor called D1, but decreases another receptor type called D2, which is inhibitory. Regular sugar consumption also inhibits the action of the dopamine transporter, a protein which pumps dopamine out of the synapse and back into the neuron after firing.
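To get a feel for why a sluggish dopamine transporter means prolonged signalling, it helps to think of dopamine clearance as simple exponential decay. The toy model below is purely illustrative – the rate constants are invented, not taken from any of the studies discussed here – but it shows how halving the transporter's clearance rate roughly doubles how long dopamine lingers in the synapse.

```python
# Toy first-order clearance model (illustrative only): dopamine released
# into a synapse is pumped back out by the transporter at rate k.
# Inhibiting the transporter (smaller k) leaves dopamine around longer.
import math

def time_to_clear(k_transporter, fraction_left=0.05):
    # [DA](t) = DA0 * exp(-k * t); solve for the time t at which the
    # level falls to `fraction_left` of its peak.
    return -math.log(fraction_left) / k_transporter

for k in (10.0, 5.0):  # per-second clearance rates: normal vs. inhibited
    print(f"k = {k}/s -> cleared in ~{time_to_clear(k) * 1000:.0f} ms")
```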

Image shows candy sprinkles on a woman’s lips. NeuroscienceNews.com image is adapted from The Conversation article.

In short, this means that repeated access to sugar over time leads to prolonged dopamine signalling, greater excitation of the brain’s reward pathways and a need for even more sugar to activate all of the midbrain dopamine receptors like before. The brain becomes tolerant to sugar – and more is needed to attain the same “sugar high.”

Sugar withdrawal is also real

Although these studies were conducted in rodents, it’s not far-fetched to say that the same primitive processes are occurring in the human brain, too. “The cravings never stopped, [but that was] probably psychological,” Andrew told me. “But it got easier after the first week or so.”

In a 2002 study by Carlo Colantuoni and colleagues of Princeton University, rats that had undergone a typical sugar dependence protocol then underwent “sugar withdrawal.” This was facilitated by either food deprivation or treatment with naloxone, a drug used for treating opiate overdose that blocks receptors in the brain’s reward system. Both withdrawal methods led to physical problems, including teeth chattering, paw tremors, and head shaking. Naloxone treatment also appeared to make the rats more anxious: they spent less time on an elevated apparatus that lacked walls on either side.

Similar withdrawal experiments by others also report behaviour similar to depression in tasks such as the forced swim test. Rats in sugar withdrawal are more likely to show passive behaviours (like floating) than active behaviours (like trying to escape) when placed in water, suggesting feelings of helplessness.

A new study published by Victor Mangabeira and colleagues in this month’s Physiology & Behavior reports that sugar withdrawal is also linked to impulsive behaviour. Initially, rats were trained to receive water by pushing a lever. After training, the animals returned to their home cages and had access to a sugar solution and water, or just water alone. After 30 days, when rats were again given the opportunity to press a lever for water, those who had become dependent on sugar pressed the lever significantly more times than control animals, suggesting impulsive behaviour.

These are extreme experiments, of course. We humans aren’t depriving ourselves of food for 12 hours and then allowing ourselves to binge on soda and doughnuts at the end of the day. But these rodent studies certainly give us insight into the neuro-chemical underpinnings of sugar dependence, withdrawal, and behaviour.

Through decades of diet programmes and best-selling books, we’ve toyed with the notion of “sugar addiction” for a long time. There are accounts of those in “sugar withdrawal” describing food cravings, which can trigger relapse and impulsive eating. There are also countless articles and books about the boundless energy and new-found happiness in those who have sworn off sugar for good. But despite the ubiquity of sugar in our diets, the notion of sugar addiction is still a rather taboo topic.

Are you still motivated to give up sugar for Lent? You might wonder how long it will take until you’re free of cravings and side-effects, but there’s no answer – everyone is different and no human studies have been done on this. But after 40 days, it’s clear that Andrew had overcome the worst, likely even reversing some of his altered dopamine signalling. “I remember eating my first sweet and thinking it was too sweet,” he said. “I had to rebuild my tolerance.”

And as fellow regulars of a local bakery in Hershey, I can assure you, readers, that he has done just that.

ABOUT THIS NEUROSCIENCE RESEARCH ARTICLE

Source: Jordan Gaines Lewis – The Conversation
Image Source: NeuroscienceNews.com image is adapted from The Conversation article.

CITE THIS NEUROSCIENCENEWS.COM ARTICLE
The Conversation “Not So Sweet? This Is What Happens to the Brain When You Give Up Sugar for Lent.” NeuroscienceNews. NeuroscienceNews, 4 March 2017.
<http://neurosciencenews.com/sugar-brain-neuroscience-6199/>.
FEEL FREE TO SHARE THIS NEUROSCIENCE NEWS.

Hierarchies exist in the brain because of lower connection costs, research shows

Findings may also improve artificial intelligence and robotics systems

The Evolutionary Origins of Hierarchy: Evolution with performance-only selection results in non-hierarchical and non-modular networks, which take longer to adapt to new environments. However, evolving networks with a connection cost creates hierarchical and functionally modular networks that can solve the overall problem by recursively solving its sub-problems. These networks also adapt to new environments faster. (credit: Henok Mengistu et al./PLOS Comp. Bio.)

New research suggests why the human brain and other biological networks exhibit a hierarchical structure, and the study may improve attempts to create artificial intelligence.

The study, by researchers from the University of Wyoming and the French Institute for Research in Computer Science and Automation (INRIA, in France), demonstrates that the evolution of hierarchy – a simple system of ranking – in biological networks may arise because of the costs associated with network connections.

This research also supports Ray Kurzweil’s theory of the hierarchical structure of the neocortex, presented in his 2012 book, How to Create a Mind. The human brain has separate areas for vision, motor control, and tactile processing, for example, and each of these areas consists of sub-regions that govern different parts of the body.

Evolutionary pressure to reduce the number and cost of connections

The research findings suggest that hierarchy evolves not because it produces more efficient networks, but because hierarchically wired networks have fewer connections. That’s because connections in biological networks are expensive – they have to be built, maintained, and so on – so there is evolutionary pressure to reduce the number of connections.

In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings could also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

The study, led by Henok S. Mengistu, is described in an open-access paper in PLOS Computational Biology. The researchers also simulated the evolution of computational brain models, known as artificial neural networks, both with and without a cost for network connections. They found that hierarchical structures emerge far more frequently when a cost for connections is present.
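The paper's simulations evolve networks on genuinely hierarchical tasks; as a much simpler illustration of the core manipulation – selection with versus without a per-connection cost – here is a minimal, hypothetical sketch in Python. The toy task, parameters, and fitness function are invented for illustration; only the cost term mirrors the study's key idea.

```python
# Minimal sketch (not the paper's code): evolve a tiny network's wiring,
# where fitness optionally penalizes the number of connections.
import random

N = 8        # nodes in the toy network
POP = 50     # population size
GENS = 200   # generations
COST = 0.02  # per-connection penalty; set to 0.0 for performance-only selection

def random_genome():
    # genome = flattened adjacency matrix of 0/1 connections
    return [random.randint(0, 1) for _ in range(N * N)]

def performance(genome):
    # Stand-in task score; the real study evaluates networks on a
    # hierarchical task. Here we simply reward a chain i -> i+1.
    return sum(genome[i * N + (i + 1)] for i in range(N - 1)) / (N - 1)

def fitness(genome):
    # The study's key manipulation: subtract a cost proportional to the
    # number of connections, pressuring networks toward sparse wiring.
    return performance(genome) - COST * sum(genome)

def mutate(genome, rate=0.02):
    return [1 - g if random.random() < rate else g for g in genome]

pop = [random_genome() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)  # select the fittest half
    survivors = pop[: POP // 2]
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(POP - len(survivors))]

best = max(pop, key=fitness)
print("performance:", performance(best), "connections:", sum(best))
```

With COST set to 0.0, excess connections carry no penalty and the evolved wiring stays dense; with a small positive cost, the surviving networks keep only the connections the task actually needs.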

Apart from explaining why biological networks are hierarchical, the research may also explain why many man-made systems, such as the Internet and road networks, are also hierarchical. “The next step is to harness and combine this knowledge to evolve large-scale, structurally organized networks in the hopes of creating better artificial intelligence and increasing our understanding of the evolution of animal intelligence, including our own,” according to the researchers.


Abstract of The Evolutionary Origins of Hierarchy

Hierarchical organization – the recursive composition of sub-modules – is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force – the cost of connections – promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

The future of AI is neuromorphic. Meet the scientists building digital ‘brains’ for your phone

Neuromorphic chips are being designed to specifically mimic the human brain – and they could soon replace CPUs


Image: brain activity map (Neuroscape Lab)

AI services like Apple’s Siri and others operate by sending your queries to faraway data centers, which send back responses. The reason they rely on cloud-based computing is that today’s electronics don’t come with enough computing power to run the processing-heavy algorithms needed for machine learning. The typical CPUs most smartphones use could never handle a system like Siri on the device. But Dr. Chris Eliasmith, a theoretical neuroscientist and co-CEO of Canadian AI startup Applied Brain Research, is confident that a new type of chip is about to change that.

“Many have suggested Moore’s law is ending and that means we won’t get ‘more compute’ cheaper using the same methods,” Eliasmith says. He’s betting on the proliferation of ‘neuromorphics’ — a type of computer chip that is not yet widely known but already being developed by several major chip makers.

Traditional CPUs process instructions based on “clocked time” – information is transmitted at regular intervals, as if managed by a metronome. By packing in digital equivalents of neurons, neuromorphics communicate in parallel (and without the rigidity of clocked time) using “spikes” – bursts of electric current that can be sent whenever needed. Just like our own brains, the chip’s neurons communicate by processing incoming flows of electricity – each neuron able to determine from the incoming spike whether to send current out to the next neuron.
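To make the “spikes” idea concrete, here is a toy leaky integrate-and-fire neuron in Python – the textbook abstraction that spiking designs are loosely based on, with constants invented for illustration rather than taken from any particular chip. Current accumulates on a leaky “membrane,” and the neuron fires only when a threshold is crossed, not on every tick of a clock.

```python
# Toy leaky integrate-and-fire neuron (illustrative constants):
# input current charges a leaky membrane; when the potential crosses
# a threshold, the neuron emits a spike and resets.
dt = 0.001         # simulation time step (s)
tau = 0.02         # membrane time constant (s)
v_threshold = 1.0  # spike threshold
v = 0.0            # membrane potential
spike_times = []

for step in range(1000):  # one second of simulated time
    t = step * dt
    input_current = 1.2  # constant drive; real inputs would vary
    v += (dt / tau) * (input_current - v)  # leaky integration
    if v >= v_threshold:
        spike_times.append(t)  # "send current to the next neuron"
        v = 0.0                # reset after firing

print(f"{len(spike_times)} spikes in 1 s of simulated time")
```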

What makes this a big deal is that these chips require far less power to process AI algorithms. For example, one neuromorphic chip made by IBM contains five times as many transistors as a standard Intel processor, yet consumes only 70 milliwatts of power. An Intel processor would use anywhere from 35 to 140 watts, or up to 2000 times more power.
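The arithmetic behind that comparison: 70 milliwatts is 0.07 watts, so at the top of that range a conventional processor draws 140 ÷ 0.07 = 2,000 times as much power as the neuromorphic chip.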

Eliasmith points out that neuromorphics aren’t new and that their designs have been around since the 80s. Back then, however, the designs required specific algorithms be baked directly into the chip. That meant you’d need one chip for detecting motion, and a different one for detecting sound. None of the chips acted as a general processor in the way that our own cortex does.

This was partly because there wasn’t any way for programmers to design algorithms that could do much with a general purpose chip. So even as these brain-like chips were being developed, building algorithms for them remained a challenge.

Eliasmith and his team are keenly focused on building tools that would allow a community of programmers to deploy AI algorithms on these new cortical chips.

Central to these efforts is Nengo, a compiler that developers can use to build their own algorithms for AI applications that will operate on general purpose neuromorphic hardware. A compiler is a software tool that translates the code programmers write into the complex instructions that get hardware to actually do something. What makes Nengo useful is its use of the familiar Python programming language – known for its intuitive syntax – and its ability to put the algorithms on many different hardware platforms, including neuromorphic chips. Pretty soon, anyone with an understanding of Python could be building sophisticated neural nets made for neuromorphic hardware.
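For a flavour of what that looks like, here is a minimal model written against Nengo's public Python API – the sizes and signals are illustrative choices, not anything prescribed by Applied Brain Research. A population of spiking neurons represents and smooths a sine-wave input; the same model description can then be targeted at different backends, which is the portability described above.

```python
# Minimal Nengo model: 100 spiking neurons representing a 1-D signal.
import numpy as np
import nengo

model = nengo.Network(label="demo")
with model:
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))     # input signal
    neurons = nengo.Ensemble(n_neurons=100, dimensions=1)  # spiking population
    nengo.Connection(stim, neurons)                        # wire input -> neurons
    readout = nengo.Probe(neurons, synapse=0.01)           # record decoded output

# The reference simulator runs on a CPU; neuromorphic backends
# substitute in here without changing the model above.
with nengo.Simulator(model) as sim:
    sim.run(1.0)  # simulate one second

print(sim.data[readout].shape)  # decoded estimate of sin(2*pi*t) over time
```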

“Things like vision systems, speech systems, motion control, and adaptive robotic controllers have already been built with Nengo,” Peter Suma, a trained computer scientist and the other CEO of Applied Brain Research, tells me.

Perhaps the most impressive system built using the compiler is Spaun, a project that in 2012 earned international praise for being the most complex brain model ever simulated on a computer. Spaun demonstrated that computers could be made to interact fluidly with the environment and perform human-like cognitive tasks, like recognizing images and controlling a robot arm that writes down what it sees. The machine wasn’t perfect, but it was a stunning demonstration that computers could one day blur the line between human and machine cognition. Recently, by using neuromorphics, most of Spaun has been run 9,000 times faster, using less energy than it would on conventional CPUs – and by the end of 2017, all of Spaun will be running on neuromorphic hardware.

Eliasmith won NSERC’s John C. Polanyi Award for that project – Canada’s highest recognition for a breakthrough scientific achievement – and once Suma came across the research, the pair joined forces to commercialize these tools.

“While Spaun shows us a way towards one day building fluidly intelligent reasoning systems, in the nearer term neuromorphics will enable many types of context aware AIs,” says Suma. Suma points out that while today’s AIs like Siri remain offline until explicitly called into action, we’ll soon have artificial agents that are ‘always on’ and ever-present in our lives.

“Imagine a Siri that listens to and sees all of your conversations and interactions. You’ll be able to ask it things like – ‘Who did I have that conversation with about doing the launch for our new product in Tokyo?’ or ‘What was that idea for my wife’s birthday gift that Melissa suggested?’” he says.

When I raised concerns that some company might then have an uninterrupted window into even the most intimate parts of my life, I was reminded that because the AI would be processed locally on the device, there’s no need for that information to touch a server owned by a big company. And for Eliasmith, this ‘always on’ component is a necessary step towards true machine cognition. “The most fundamental difference between most available AI systems of today and the biological intelligent systems we are used to is the fact that the latter always operate in real-time. Bodies and brains are built to work with the physics of the world,” he says.

Already, major players across the IT industry are racing to get their AI services into the hands of users. Companies like Apple, Facebook, Amazon, and even Samsung are developing conversational assistants they hope will one day become digital helpers.

With the rise of neuromorphics, and tools like Nengo, we could soon have AIs capable of exhibiting a stunning level of natural intelligence – right on our phones.