Is the Internet threatened by the rising seas?

In the collective imagination, the internet lives in the cloud, when it is actually much more tangible than we like to think. It spreads across the globe through a physical network of underground and undersea cables that grows with the demand for internet access. This network is now in danger: the rise in sea levels caused by global warming threatens the internet infrastructure located along the coasts. The damage caused by rising water could greatly affect our modern lifestyle.

A tangible and massive internet network

The internet is described by the Cambridge Dictionary as “the large system of connected computers around the world that allows people to share information and communicate with each other”. It has three main components: the end-user equipment, the data centres and the internet network.

This network is itself composed of several elements, such as fibre-optic cables, hardware servers, data transfer stations and power stations. These interconnected elements weave a web that carries information from one end of the world to the other, and its total length is difficult to estimate. In 2014, there were 285 submarine communication cables, representing about 550,000 miles. The size of the terrestrial network is even harder to gauge, as it grows with demand and newly installed cables intermix with the old ones.

In the United States, it is estimated that most internet infrastructure was built in the 1990s and 2000s. At that time, the development of the network followed the development of major American cities. Today, operators tend to install network extensions alongside other infrastructure such as roads, railways or power lines. In many areas of the world, and throughout history, cities and megacities have developed along the coastlines: port cities synonymous with wealth, opportunity and business. These attractive and often densely populated cities now face a danger: the flooding of their internet network.

The rising seas gaining internet ground

Paul Barford, a computer scientist, and his graduate student Ramakrishnan Durairajan undertook a mapping of US internet infrastructure. As the infrastructure is private and belongs to the operators, its locations are kept mostly secret to avoid possible damage. In mapping the network, they observed that it becomes denser in areas of high population, which are often coastal cities.

They presented their findings to Carole Barford, a climate scientist, and together they became aware of the risk of flooding for part of the network. They decided to superimpose their map onto the sea-level-rise projections of the National Oceanic and Atmospheric Administration (NOAA). Through this research, they estimated that by 2033 about 4,000 miles of cable and 1,100 traffic hubs would be underwater in the US. For New York City alone, about 20% of the internet network would be underwater.
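To give a sense of how such an overlay can work in practice, here is a minimal sketch in Python using geopandas, assuming hypothetical shapefiles for the fibre conduits and for a NOAA sea-level-rise projection; the file names and projection choices are illustrative, not the researchers' actual pipeline:

```python
# Minimal sketch of overlaying an internet-infrastructure map with a
# sea-level-rise projection. File names are hypothetical placeholders.
import geopandas as gpd

conduits = gpd.read_file("us_fibre_conduits.shp")   # assumed conduit map (lines)
inundation = gpd.read_file("noaa_slr_2033.shp")     # assumed NOAA flood polygons

# Reproject both layers to a metre-based CRS (CONUS Albers) so lengths are meaningful.
conduits = conduits.to_crs(epsg=5070)
inundation = inundation.to_crs(epsg=5070)

# Keep only the conduit segments that fall inside the projected flood zone.
flooded = gpd.overlay(conduits, inundation, how="intersection")

miles_underwater = flooded.geometry.length.sum() / 1609.34
print(f"Conduit mileage projected underwater: {miles_underwater:,.0f} miles")
```

Intersecting the two layers and summing the lengths of the surviving segments is, in essence, how a figure like “4,000 miles underwater” can be derived.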

We should not underestimate the repercussions such flooding would have on our current lifestyles. Many services depend on the internet, such as traffic lights, medical monitoring or cash dispensers. Cities have already suffered blackouts due to flooding: in 2012, during Hurricane Sandy, 10% of New York City was deprived of electricity.

The problem is that the terrestrial network is designed to be water resistant, not waterproof: it is not meant to operate under water.

Unlike submarine cables, the cables buried in the ground are mainly protected by a plastic sheath. They are not adequately protected against flooding or frost. And since part of the network dates back decades, it may be even more fragile than the newer extensions.

It was at the Applied Networking Research Workshop in Montreal, on July 16, 2018, that the three scientists presented their study of US territory. As Carole Barford said, “The 15-year predictions are really kind of locked in”; nobody can change what will happen. The main cities involved are New York, Miami and Seattle.

Saving the Internet … from itself?

“If we want to be able to function like we expect every day, we’re going to have to spend money and make allowances and plans to accommodate what’s coming,” said Carole Barford. “Most of the damage that’s going to be done in the next 100 years will be done sooner than later … That surprised us. The expectation was that we’d have 50 years to plan for it. We don’t have 50 years,” added Paul Barford.

So, what are the solutions to avoid this submersion of the network?

The first would be to locate all the infrastructure that makes up the internet network. Despite the risk of deliberate damage, it is necessary to identify the infrastructure that will be underwater in a few years. The study predicts that about 4,000 miles of cable and 1,100 traffic hubs will eventually be underwater, an estimate based only on the networks the researchers knew about. The study must also be extended to every continent and country: as rising water levels are a global effect of climate change, many coastal cities are likely to be affected.

In order to limit the impact of rising water on the internet, operators can envisage several solutions: strengthening the current network, moving it further inland, or routing computer signals around submerged areas. However, none of these solutions is perfect or permanent. Strengthening infrastructure will only work for so long. Routing around submerged areas will degrade the quality of the network and could cause latency. Moving existing infrastructure, or building new infrastructure, will require significant financial investments that could affect the end user.
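The latency concern can be made concrete with rough arithmetic: signals in optical fibre travel at about two thirds of the speed of light in a vacuum, roughly 200,000 km/s, so every extra kilometre of rerouted path adds a predictable delay. A quick sketch with round, illustrative numbers:

```python
# Back-of-the-envelope latency cost of rerouting traffic around flooded areas.
SPEED_IN_FIBRE_KM_S = 200_000  # ~2/3 of the speed of light in vacuum (approximation)

def one_way_delay_ms(extra_km: float) -> float:
    """Extra one-way delay caused by a longer fibre path, in milliseconds."""
    return extra_km / SPEED_IN_FIBRE_KM_S * 1000

for extra in (100, 500, 1000):
    print(f"+{extra} km of fibre -> +{one_way_delay_ms(extra):.1f} ms one-way")
# +100 km -> +0.5 ms, +500 km -> +2.5 ms, +1000 km -> +5.0 ms
```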

Our internet use seems to be in danger. But does the internet contribute to its own destruction? It is not as green as it seems. We power data centres, one of the main components of the internet, with unsustainable energy sources, creating carbon emissions. Forbes estimated that the carbon footprint of data centres alone is equivalent to that of the global aviation industry, about 2% of global emissions. These carbon emissions, driven by our ever-increasing use of the internet, are one of the causes of the melting of the ice caps and of rising water levels.

Wouldn’t it be ironic if our growing internet addiction was its own worst enemy?

The invisible pollution of the internet

What if the internet became the primary cause of global warming? Ian Bitterlin, a data centre expert, estimates that by 2030 the internet will consume 20% of the world’s electricity. Today, the energy consumed by the internet is, for the most part, not of green origin; it generates an ever-increasing carbon footprint and contributes to global warming. Under social pressure and increasingly frequent investigations by independent organisations, large companies are now embarking on a race for a green internet.

The energy greed of the internet

A power-hungry global network

To determine the energy consumption of the internet, one must first ask what the internet is. According to the Cambridge Dictionary, the internet is “the large system of connected computers around the world that allows people to share information and communicate with each other”. A study conducted by Ericsson and TeliaSonera determined that the three most energy-hungry components of this “large system” are the end-user equipment, the data centres and the networks.

The end-user equipment

According to a 2017 study from the Centre for the Digital Future, Americans spend an average of one full day per week connected to the internet. A study from Statista indicates that teenagers are even more exposed: they spend about 4 hours a day online, which adds up to a little over a full day per week. These numbers are further evidence of the constant connectivity we experience daily. To stay connected, we use devices that we regularly recharge, thus consuming energy.

The data centres

Data centres are also very greedy. A data centre is “a place where a number of computers that contain large amounts of information can be kept safely”, according to the Cambridge Dictionary. Each click, each message sent, each video watched solicits these computer farms. They use electricity to operate, but above all to keep cool: cooling the computers alone accounts for 40 to 50% of the electricity consumed. McKinsey & Company estimates that only 6% to 12% of the power is used for actual computing; much of the rest keeps servers idling, as a safeguard against surges in activity that could crash their operations.

To illustrate the amount of energy consumed by a data centre, Peter Gross, an engineer and designer of power systems for data centres, said: “A single data centre can take more power than a medium-size town”. In France, the energy consumption of data centres is higher than the electricity consumption of the city of Lyon (French Electricity Union, 2015). Globally, data centres account for up to 3% of energy consumption, The Independent wrote in 2016.

The internet network

The networks which give access to the internet are also growing. They rely on technologies such as DSL, cable modems and optical fibre, and they too run on energy.

 

To determine how energy consumption is shared between these three major components, the ACEEE estimated in 2012 that downloading a gigabyte of data consumes 5.12 kWh of power: 48% in data centres, 38% in end-user equipment and 14% in internet networks.
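As a quick sanity check, the per-component figures follow directly from the numbers above:

```python
# Splitting the ACEEE figure of 5.12 kWh per downloaded gigabyte
# across the three components, using the shares quoted above.
KWH_PER_GB = 5.12
shares = {"data centres": 0.48, "end-user equipment": 0.38, "internet networks": 0.14}

for component, share in shares.items():
    print(f"{component}: {share * KWH_PER_GB:.2f} kWh per GB")
# data centres: 2.46, end-user equipment: 1.95, internet networks: 0.72
```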

 

The vagueness of global consumption estimates

Determining the global energy consumption of the internet is complicated. The Centre for Energy-Efficient Telecommunications (CEET) attempted it, estimating that the internet accounted for 1.5% to 2% of the world’s total energy consumption in 2013. Compared with the consumption of countries, the internet would be the world’s fifth-largest energy consumer. In 2014, Jon Koomey, a professor at Stanford University known for describing Koomey’s law, estimated this consumption at around 10%, while in 2017 Greenpeace put it at the lower rate of 7%.

A few reasons can explain this critical difference. The main one is that when end-user equipment consumes energy, this energy is not necessarily used to connect to the internet: a laptop or computer can be used offline to play video games. Allocating the share of electricity used for the internet connection is therefore very complicated, and some experts prefer not to count the consumption of these devices at all, so as not to distort the numbers. Moreover, experts expect this power consumption to double every four years: The Guardian predicts that by 2020 the internet will account for 12% of global energy consumption.

With great power come great sustainable responsibilities

The dark side of the power

The problem with the energy consumption of the internet lies in tracking what kind of energy the network uses. As Gary Cook, a senior policy analyst at Greenpeace, said: “How we power our digital infrastructure is rapidly becoming critical to whether we will be able to arrest climate change in time. […] If the sector simply grew on its current path without any thought as to where its energy came from, it would become a major contributor to climate change far beyond what it already is.” Indeed, in 2016 The Independent wrote that the carbon footprint of data centres worldwide was equivalent to that of the global aviation industry, up to 2% of global CO2 emissions.

Some organisations have therefore investigated what share of the energy consumed by data centres is renewable. The Environmental Leader estimated that in 2015 Google and Amazon used at least 30% fossil energy to power their data centres. The same year, the Lux Research company benchmarked data centres owned by Google and found that 4 out of 7 depended on coal. In 2012, Greenpeace had already released the report “How Clean is your Cloud?”, assessing how far some companies’ clouds and data centres respected the environment.

The Green Power Race

These studies by different organisations have triggered a race for green power among the data centres of large companies. Google, Apple, Facebook and Amazon now power their data centres with 100% renewable energy, or are working towards that objective. Amazon, for example, claims to have powered its servers with at least 50% renewable energy since 2018; however, Greenpeace recently contradicted this figure, estimating the share at only 12%. Greenpeace also points out that the change triggered by these big Western companies is not enough. Sizeable Chinese web companies such as Baidu and Tencent show very little transparency, communicating little about their energy consumption or their use of green energy, and they have little access to renewable energy because of monopoly utilities. And while the GAFA are under the spotlight, medium and small data centres remain off the radar.

Nonetheless, the International Energy Agency (IEA) announced that despite a sharp increase in data centre workload (about 30% by 2020), the corresponding increase in electricity use would only be about 3%: data centres are becoming more and more energy efficient.
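The arithmetic behind that claim is worth spelling out: if workload grows by about 30% while electricity use grows by only about 3%, the energy needed per unit of work falls by roughly a fifth. A quick illustration using the figures quoted above:

```python
# Implied efficiency gain from the IEA figures: workload +30%, electricity +3%.
workload_growth = 1.30
energy_growth = 1.03

energy_per_unit = energy_growth / workload_growth
print(f"Energy per unit of workload: {energy_per_unit:.2f}x "
      f"(~{(1 - energy_per_unit) * 100:.0f}% less energy per unit of work)")
# -> about 0.79x, i.e. roughly 21% more efficient
```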

 

The internet remains the most important source of information, and it has also made less polluting solutions possible. Reading an email is more eco-friendly than printing it on paper; using an app to find a parking space is greener than driving around in circles looking for one. If you find yourself worrying about this invisible pollution we generate daily, rest easy in the knowledge that the internet also contains tips to reduce its own electricity consumption.

 


Digital Technologies for Employee Engagement

My last blog was about the key rules to apply for better employee engagement. I recently had the opportunity to attend the first Open Innovation Contest organized by the CA Village in Grenoble, bringing together start-ups and big companies. And guess what? All the selected start-ups presented products to better engage employees, which confirms that this is a key concern for companies today.

Let me share the different solutions that were selected by the sponsors HP, Mazars and Mutualia, and analyze how they align with my key rules of engagement detailed in the Digital Me Up post How to keep people engaged.

Open communication and no more silos

 

Who has never complained about the number of newsletters received each week that end up unread? Or discovered by chance that the person on the other side of the partition wall knows all about your current topic of concern and could have solved your problem in a few minutes?

In 2017, Steeple launched an interesting software-as-a-service solution that fixes these miscommunication issues.

They replace company bulletin boards with touch screens that are easy to feed with information. Community managers can load content into pre-defined cloud containers and distribute it across the company in one minute; it then shows up on screens or on employees’ mobile phones.

No way to miss the company’s core strategy, the latest weekly sales numbers or the basketball event organized for next week.

And it is easy to publish classified ads or ask for advice on your topic of concern.

The solution is very well packaged, easy to use and easy to maintain, which is critical when deploying such new tools.

Team learning goes with team playing

 

Kaperli is another dynamic start-up, specialized in serious games, with a full offering built around creativity, innovation and digital culture.

The founders acknowledge, through their own education and training experience, that the best way to learn is to learn as a team and to learn by doing.

The company has developed a box called Sesame that takes teams through selected situations. Through these situations, people can learn about new topics, make assumptions and solve problems together.

We are far from classic webinars and passive training programs.

In addition, they can co-create a specific solution with you if the off-the-shelf offering does not cover all your needs. Collective intelligence is truly valued by the Kaperli solution.

Salespeople like competition

 

The last start-up, Motivate.me, is a newly founded one. Its target is to drive motivation within sales teams.

Indeed, salespeople are massively impacted by the digital transformation: e-commerce can drive transactional sales effectively, and data-driven marketing can target customers by real opportunity. Expectations of sales teams keep increasing.

The idea of Motivate.me is to turn salespeople into superheroes. The solution connects each of them to the CRM system, giving them funnel visibility in real time and sharing best practices among teams.

And since sales reps like challenges, companies can organize contests between teams and promote the winners, bringing adrenaline and fun to sales organizations.

It is an interesting idea that will need to be monitored carefully: it brings automatic feedback to teams, but it can also increase stress and put people under constant pressure.

 

In conclusion

Indeed, the three start-ups have products and solutions that contribute to successful rules of engagement:

  • have a leader who shares their vision and a company whose values are aligned with employees’
  • trust employees and give them autonomy to deliver
  • listen to people and give regular feedback
  • deliver training to keep up with technology change or need for new skills
  • associate employees into change through collective intelligence

The three solutions alone might not be enough to solve employee engagement; as said in the previous post, the human factor is critical. HR and managers are there to set the right framework, and to trust and listen to employees. However, these technologies are genuinely good assets for facilitating communication and team collaboration.

 

 

Sources

Whitepaper on engagement: https://www.glintinc.com/resource-center/

Motivation factors: https://www.maddyness.com/2018/04/02/tribune-tuer-motivation-de-vos-employes-10-lecons/

Photo/Video/Infographic credits: NE


How is Artificial Intelligence going to change the healthcare sector?

Artificial intelligence (AI) is a set of algorithms that can process enormous amounts of data in a very short time and produce results that come close to human intelligence.

According to Tractica, AI will represent $11 billion worldwide by 2024. Another market is also growing at a phenomenal speed: the global e-health market, which will reach nearly $400 billion in 2022 according to Grand View Research.

In the health sector, AI opens up very promising opportunities for improving the quality of care through more personalized and predictive medicine. It can also help health professionals make quicker and better-informed decisions in their daily work.

Today, healthcare is mainly about curative medicine. AI will make it possible to switch to more preventive and personalized medicine.

A faster diagnosis

Thanks to AI, it is possible to better detect symptoms and predict the progression of a disease by analysing results such as medical images, picking up signs that are not detectable to the naked eye.

This information enables doctors to establish a diagnostic hypothesis earlier and to formulate more personalized therapeutic proposals. Diagnosis and therapeutic strategy thus become better adapted to the patient’s needs, environment and lifestyle.

For instance, at the Parisian hospital La Salpêtrière, an AI platform has been developed to revolutionize the treatment of diseases such as liver or breast cancer. Medical imaging combined with deep learning and big data analytics allows better extraction of biomarkers of disease progression.

For cardiologists, Cardiologs, a startup expert in machine learning, has developed an automatic electrocardiogram (ECG) interpretation solution that works in real time thanks to AI.
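To give a flavour of the general approach (a minimal sketch only, not Cardiologs’ actual, proprietary model), here is the kind of small 1D convolutional network that maps a raw ECG trace to rhythm classes; the input shape, sampling rate and class count are illustrative assumptions:

```python
# Minimal sketch of a 1D CNN for ECG rhythm classification (illustrative only).
import torch
import torch.nn as nn

class TinyECGNet(nn.Module):
    def __init__(self, n_classes: int = 4):  # e.g. normal, AF, other, noise (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),           # collapse the time axis
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                      # x: (batch, 1, samples)
        return self.classifier(self.features(x).squeeze(-1))

# One 10-second single-lead trace sampled at 300 Hz (random data for illustration).
ecg = torch.randn(1, 1, 3000)
print(TinyECGNet()(ecg).softmax(dim=-1))       # class probabilities
```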

A better patient care

AI also makes it possible to monitor the patient’s condition in real time: for example, monitoring their physiological state, recording their symptoms, or interacting with their environment.

More and more, the collection of symptoms is no longer done only during the patient’s consultation with their doctor.

Assistance robots for the elderly will be developed further, able to operate at a distance and to help their users make precise movements.

These robots, fitted with sensors and driven by AI software, can react to touch, sound and light, recognize their names and adapt to the expectations of their interlocutors.

Through home automation technologies, AI can also simplify seniors’ daily lives: connected objects and cameras can understand what they see in real time, check the health condition of the patient and alert the family or the doctors in case of unusual situations.

AI robots will not replace care assistants in the future, but they may take over tasks to relieve them. For instance, robots could help with lifting and moving patients, keeping the elderly company, monitoring their health data…

Intelligent prostheses also aim to repair the human body, or even to increase physical performance, with artificial limbs or organs. Thanks to AI, it is now possible to perform almost all movements. For instance, exoskeletons will give paraplegics the ability to stand up or climb stairs.

The American start-up BrainRobotics has also developed an AI prosthetic hand able to manipulate objects. The prosthesis is equipped with a digital camera and a microcomputer to identify the object to grasp, recognizing different shapes from databases of many pictures.
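To make the recognition step concrete, here is a minimal sketch of how a single camera frame could be classified with an off-the-shelf pretrained network; this illustrates the technique, not BrainRobotics’ actual software, and the image file name is a placeholder:

```python
# Minimal sketch: classify a camera frame with a pretrained image model.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

frame = Image.open("camera_frame.jpg").convert("RGB")   # placeholder capture
with torch.no_grad():
    probs = model(preprocess(frame).unsqueeze(0)).softmax(dim=-1)
print(probs.argmax().item())   # index of the most likely object class
```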

It raises many ethical issues, including the protection of privacy and personal data, but also the consequences of blurring the human-robot border – a border that can be quickly crossed by the user.

The real impact on surgery

Computer-assisted surgery now makes it possible to improve the precision of gestures and to operate remotely. Within five years, a system will be set up to analyse the progress of a surgical operation in real time: the computer will memorize surgical procedures and allow the creation of algorithms.

Thanks to these algorithms, the system can alert the surgeon if an action becomes “risky”, either because it does not correspond to the expected procedure in the surgeries the algorithm has already memorized, or because it deviates from the trajectory defined during a simulation on a 3D model.
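A minimal sketch of such a deviation alert, assuming the planned trajectory from the 3D simulation and the observed instrument positions arrive as matching sequences of 3D points; the coordinates and the tolerance threshold are purely illustrative:

```python
# Minimal sketch: flag a surgical gesture that strays from the planned trajectory.
import numpy as np

planned = np.array([[0.0, 0.0, 0.0], [1.0, 0.5, 0.2], [2.0, 1.0, 0.4]])   # from 3D model
observed = np.array([[0.1, 0.0, 0.0], [1.0, 0.9, 0.2], [2.0, 1.1, 0.5]])  # live positions

TOLERANCE = 0.3  # assumed acceptable deviation, in the model's units
deviations = np.linalg.norm(observed - planned, axis=1)
if (deviations > TOLERANCE).any():
    print(f"Alert: gesture deviates up to {deviations.max():.2f} from the plan")
```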

A first trial has already been conducted on 120 recordings of gall bladder removals; after 50 interventions, the computer had already understood that this surgery always takes place in seven steps.

 

To sum up, here is a chart illustrating the areas of application of AI in healthcare:

[Chart: AI in healthcare, areas of application]