By Lance Eliot, the AI Trends Insider
What happens when you put multiple clouds together?
If your first thought drifts toward wondrous clouds in the sky, perhaps your answer might be that you end up with clouds that could be described as cirrus, alto, stratus, or perhaps cumulonimbus in nature. For those who relish watching the formation of clouds, lazily observing as the fluffy crystals gently float along, you can witness quite a dance. You might watch in rapt amazement as they float off toward the horizon.
I don’t want to interrupt your idyllic daydreaming, but the kind of clouds I was referring to consists of state-of-the-art computing resources and has become an essential element of any modern-day company.
A recent news announcement that garnered some outsized headlines provides ample indication of how multi-cloud amalgamations are soon going to emerge as a vital cornerstone of commerce and proffer savvy opportunities across all industries and sectors of business.
A cavalcade of hospitals announced that they are banding together to create a supercharged data collection of health data. Top named hospital systems such as Tenet Healthcare, Trinity Health, Aurora Health, Providence, and others are forming a virtual multi-cloud collective, doing so as part of the startup launch of Truveta Inc. (a newly formed company that will aggregate, analyze, and sell anonymized health-related golden nuggets, as it were).
Besides bringing together data on a massive scale that no single hospital system would have, the cooperative aims to go beyond just being a mighty storehouse of health-related info. The belief is that AI systems can turn the abundance of data into actionable results. Aspirations include quicker cures for diseases, personalized medicine finally becoming a reality, and vast improvements in health equity.
The tagline: saving lives with data.
Of course, they will have to showcase that the data is well-protected and anonymized. Estimates are that they will initially be able to bring together nearly 13% of the clinical care data across the entire United States. Pulling together this immensity of data is a huge lift unto itself, let alone also ensuring that there are appropriate privacy controls, security locks, and everything else needed to make sure this valuable data asset is used for good and not for bad.
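To make the privacy point a bit more concrete, here is a minimal sketch in Python of the kind of pseudonymization step such a data collective might apply before pooling records. To be clear, the field names, salt handling, and record schema are purely illustrative assumptions on my part, not Truveta's actual approach:

```python
import hashlib

# Illustrative only: in a real system the salt would be a carefully
# managed secret, and the identifier list would come from regulation
# (e.g., HIPAA identifiers), not a hardcoded tuple.
SALT = b"per-consortium-secret-salt"
DIRECT_IDENTIFIERS = ("patient_id", "name", "address")

def pseudonymize(record: dict) -> dict:
    """Strip direct identifiers, replacing them with a salted one-way hash."""
    token = hashlib.sha256(SALT + record["patient_id"].encode()).hexdigest()
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["patient_token"] = token
    return cleaned

record = {"patient_id": "H1-0042", "name": "Jane Doe",
          "address": "12 Elm St", "diagnosis": "J45.909"}
safe = pseudonymize(record)
```

The salted hash lets the same patient's records from different hospital systems still be linked for analysis, while the direct identifiers never leave the contributing institution.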
What other industries or business sectors are ripe for a multi-cloud bonanza?
Yes, you might have guessed that I am thinking about the realm of autonomous vehicles. Those readers who follow my coverage know that autonomous vehicles are predicted to demonstrably change our world.
This is not just going to merely alter how we get from point A to point B. All manner of vehicles, including cars, trucks, buses, trains, drones, planes, ships, submersibles, and the like will inevitably and inexorably become autonomous. There will be a wholesale and mammoth disruption in most of what we do, which if aimed right will provide widespread benefits. Take a moment to reflect on how society could be reshaped by having vehicles that no longer require humans at the wheel, and you will come to realize that this emergence will transform our existence.
If the enormity of widespread adoption of autonomous vehicles seems like a rather large bite to chew on, let’s take a focused look at the advent of true self-driving cars and see how a multi-cloud amalgamation will arise and be a tremendous treasure trove for all. We can then consider how this same visionary perspective can be further expanded to encompass the full range of autonomous vehicles.
Let’s first take a moment to clarify what I mean by my reference to true self-driving cars.
For my framework about AI autonomous cars, see the link here: https://aitrends.com/ai-insider/framework-ai-self-driving-driverless-cars-big-picture/
Why this is a moonshot effort, see my explanation here: https://aitrends.com/ai-insider/self-driving-car-mother-ai-projects-moonshot/
For more about the levels as a type of Richter scale, see my discussion here: https://aitrends.com/ai-insider/richter-scale-levels-self-driving-cars/
For the argument about bifurcating the levels, see my explanation here: https://aitrends.com/ai-insider/reframing-ai-levels-for-self-driving-cars-bifurcation-of-autonomy/
Understanding The Levels Of Self-Driving Cars
As a clarification, true self-driving cars are ones where the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.
These driverless vehicles are considered Level 4 and Level 5, while a car that requires a human driver to co-share the driving effort is usually considered at Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).
There is not yet a true self-driving car at Level 5, and we don’t yet even know whether this will be possible to achieve, nor how long it will take to get there.
Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend).
Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different from driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points next made are generally applicable).
For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that has been arising lately, namely that despite those human drivers who keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that the driver can take their attention away from the driving task while driving a semi-autonomous car.
You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3.
For why remote piloting or operating of self-driving cars is generally eschewed, see my explanation here: https://aitrends.com/ai-insider/remote-piloting-is-a-self-driving-car-crutch/
To be wary of fake news about self-driving cars, see my tips here: https://aitrends.com/ai-insider/ai-fake-news-about-self-driving-cars/
The ethical implications of AI driving systems are significant, see my indication here: https://aitrends.com/selfdrivingcars/ethically-ambiguous-self-driving-cars/
Be aware of the pitfalls of normalization of deviance when it comes to self-driving cars, here’s my call to arms: https://aitrends.com/ai-insider/normalization-of-deviance-endangers-ai-self-driving-cars/
Self-Driving Cars And Multi-Clouds
For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task. All occupants will be passengers; the AI is doing the driving.
One crucial factor to keep in mind is that the automakers are each devising their own brand and models of self-driving cars (typically in conjunction with a firm devoted to crafting self-driving capabilities, or having bought such a firm, or having assembled an in-house crew for said purposes).
Thus, automaker X might have their particular brand and models of self-driving cars. Meanwhile, automaker Y will have its own brand and models. The same for automaker Z, and so on. Of course, some automakers are joining together in onesie-twosie type arrangements, but the point overall is that we won’t have a monolith of all self-driving cars being exactly the same.
I mention this because some outside of the self-driving industry have a misimpression that somehow all self-driving cars are going to be identical, consisting of the same AI driving systems, the same sensor suites, etc. Nope, that won’t be the case. There will be a variety of both hardware and software involved in crafting self-driving cars.
Another important element of self-driving cars involves the data aspects.
A self-driving car includes a suite of sensors encompassing video cameras, radar, LIDAR, thermal imaging, ultrasonic devices, etc. These are the proverbial eyes and ears of the self-driving car. The AI driving system receives the data flowing from the sensors and has to try and interpret the data to determine the driving scene and how to best utilize the driving controls of the car.
The data collected from the sensors is initially kept on-board the processing systems within the self-driving car. There is a great deal of ongoing debate and discussion regarding how much of the data is or ought to be stored on-board the vehicle versus instantly purged.
Via the use of OTA (Over-The-Air) electronic communications, the self-driving car can upload data into the cloud of the automaker or fleet operator. Similarly, software patches for the AI driving system can be remotely downloaded via the OTA from the cloud and placed into use inside the vehicle.
There is an additional twist to the data facets.
Imagine that a self-driving car is driving down an ordinary neighborhood street. The sensors are collecting a boatload of data about whatever they happen to encounter. You might assume that the only data being sought involves figuring out where the road is and how to navigate safely on the given street.
The thing is, the sensors are collecting whatever they can get.
Consider the video cameras. The video cameras are not only obtaining visual images of the roadway, but they are also capturing whatever is happening on the sidewalks, and whatever is occurring in the front yards of the houses on the block, and so on. All told, the data from the sensors can engulf a vast array of whatever is occurring around the self-driving car as it makes its way throughout a neighborhood, or a community, or a town and a city. I’ve referred to this as the roving eye.
Okay, this all sets the stage for getting to the gist of this discussion.
Let’s assume that each of the automakers decides to upload the data that was being collected by their fleet of self-driving cars (it could be the entirety of the data or a subset). On each day, a particular automaker or fleet operator is going to collect a massive volume of data about the locales and activities taking place in a given area. Realize that their self-driving cars are continually crisscrossing the geographical realm in which the vehicles are being operated.
Automaker X brings the data from their self-driving cars into their cloud, which we’ll call cloud X. It is vital data that they can use to improve their self-driving capabilities (and, as you’ll see in a moment, has a lot of other uses too). Meanwhile, automaker Y is bringing the data from their self-driving cars into their cloud, which we’ll call cloud Y.
On and on this goes.
Suppose automaker X has their self-driving cars roaming on the west side of town. Automaker Y has their self-driving cars roaming on the east side of town. Logically, this implies that automaker X has a lot of useful data about the nature of the west side of town, while automaker Y has a lot of useful data about the east side of town. It also implies that automaker X has very little if any data about the east side of town (since their self-driving cars aren’t roaming there), and similarly that automaker Y has little if any data about the west side.
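As a toy illustration of this coverage point (the street names and pothole counts here are entirely hypothetical), amalgamating the two clouds yields town-wide data that neither automaker holds alone:

```python
# Each automaker's cloud holds roadway observations only for the area
# its own fleet roams: automaker X on the west side, automaker Y on the east.
cloud_x = {("west", "Main St"): {"potholes": 3},
           ("west", "Oak Ave"): {"potholes": 1}}
cloud_y = {("east", "Pine Rd"): {"potholes": 4}}

def amalgamate(*clouds):
    """Merge per-automaker observation stores into one town-wide view."""
    merged = {}
    for cloud in clouds:
        for segment, obs in cloud.items():
            merged.setdefault(segment, {"potholes": 0})
            merged[segment]["potholes"] += obs["potholes"]
    return merged

town = amalgamate(cloud_x, cloud_y)
```

Neither cloud alone covers all three road segments; only the amalgamated view spans both the west side and the east side of town.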
Normally, you might expect that these are fierce competitors, and they would each keep their own collected data stored in their respective clouds away from the eyes of their self-driving car rival across town. That does make sense and you can certainly expect everyone to hold their aces close to their chest, at first.
Hark back momentarily to the news story about the hospitals that have decided to band together. We would not normally expect hospitals to do so, especially if they are essentially competing in the same marketplaces. Like any business, they prize the data that they have collected about their “customers” and are apt to keep it private (of course, they also need to abide by stringent regulatory requirements too).
It took a while for the lightbulb to brighten and the right convergence of societal and technological factors to make the timing for a multi-cloud incarnation feasible, viable, and presumably rewarding.
My prediction is that gradually and inexorably the automakers and fleet operators will begin to realize that the self-driving car collected data in their respective clouds can be synergistically amalgamated to proffer added benefits that each alone could not otherwise attain.
It won’t happen overnight. Realistically, you’ve got to expect that they all first need to get their ducks in a row. There is enough newness about self-driving cars and the self-driving collected data in the cloud that trying to go beyond their singular efforts is something nearly unimaginable at this time.
You have to crawl before you can walk, and walk before you can trot, etc.
When the timing is right, what sensibly and profitably can be accomplished via a multi-cloud amalgamation of self-driving collected data?
Let’s take a quick look at seven ways to leverage such data (there are many more ways, I assure you).
- Roadway Infrastructure
Many deplore the lousy status of our existing roadways, replete with potholes, insufficient markers, and a slew of problems that endanger drivers and pedestrians alike. Self-driving cars are going to be collecting tons of data about our roadways. This can be used by governmental bodies to determine where to best spend precious infrastructure dollars. It can also be used to predict where human-driven car crashes are likely to occur and shore-up high-priority spots. Using the example of cloud data being collected by automaker X and automaker Y, amalgamating their collected cloud data would empower an assessment of the entirety of the local town, both west and east, whereas either cloud collection alone would gauge only one-half of the local roadway infrastructure.
- Up-to-date Maps
Many of today’s digitally available maps are based on a periodic scan of a given locale, such as once per month or maybe once per year. This means that the map you are using is potentially out-of-date. Self-driving cars are going to be able to update maps in near real-time. The data they collect into the cloud can be used to indicate when a bridge is out of commission for a few days due to heavy rainfall. Envision that via the multi-cloud amalgamation, a comprehensive semblance of mapping would take place over an entire community, city, or state, daily, and be immediately available.
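The freshness idea can be sketched simply: keep the newest fleet observation per road segment, so a closure surfaces as soon as any self-driving car reports it rather than on the next monthly scan. The segment names, dates, and statuses below are hypothetical examples:

```python
from datetime import date

# Pooled fleet observations: (road segment, date observed, status).
observations = [
    ("bridge-7", date(2021, 2, 1), "open"),
    ("bridge-7", date(2021, 2, 3), "closed"),  # out of commission after heavy rainfall
    ("elm-st",   date(2021, 2, 2), "open"),
]

def latest_status(obs):
    """Keep only the most recent report for each road segment."""
    latest = {}
    for segment, seen, status in obs:
        if segment not in latest or seen > latest[segment][0]:
            latest[segment] = (seen, status)
    return {seg: status for seg, (_, status) in latest.items()}

live_map = latest_status(observations)
```

With fleets from multiple automakers feeding the same amalgamated store, every segment that any fleet traverses gets refreshed, roughly daily.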
- Real Estate
If you’ve been looking to buy a home lately, you might have noticed that most real estate agents these days are uploading pictures of the homes they have for sale. These pictures are oftentimes poorly taken or worse are somewhat misleading in the nifty way that the property is portrayed. In any case, the self-driving data is going to have gobs of real estate video and imagery, collected daily. Owners of commercial property can get a daily tally of how their property looks, such as whether it got tagged with graffiti or was properly cleaned yesterday. Just imagine how the multi-cloud amalgamation could be monetized for the real estate industry, and you are already salivating at the possibilities.
- Crime Fighting
I’ve previously pointed out that the data from self-driving cars can be potentially used for crime-fighting. Currently, when a crime occurs, there is a wild rush to discover whether any stationary video cameras managed to catch the crime or the criminals, and likewise, the public at large is asked to provide any smartphone video that might be relevant. The multi-cloud amalgamation collected from the roaming self-driving cars could be utilized to figure out the whodunit and be a handy means of reducing crime in a given area (of course, this raises various notable privacy issues, as do each of these potential uses of the data).
- Pedestrian Safety
Most towns have pedestrian crosswalks that are quite dangerous, yet few realize the dangers involved and sadly only take corrective action after an injury or fatality occurs. Self-driving cars routinely scan for pedestrians to try and determine whether someone is going to suddenly dart into the street or otherwise put themselves into harm’s way by stepping off the curb at the wrong moment. From this data, analyses can reveal where the public is most tempted to jaywalk or where human drivers tend to be least attentive to the rights of pedestrians.
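The analysis itself can be quite simple: tally near-miss pedestrian events from the pooled fleet logs and rank the locations most in need of corrective action. The intersections and counts below are hypothetical stand-ins:

```python
from collections import Counter

# Near-miss pedestrian events logged across the amalgamated fleets,
# keyed by intersection.
events = ["5th&Main", "5th&Main", "Oak&2nd",
          "5th&Main", "Oak&2nd", "Pine&1st"]

# Rank intersections by incident count, most dangerous first,
# so corrective action can be taken before an injury occurs.
hotspots = Counter(events).most_common()
```

A ranking like this lets a town fix its most dangerous crosswalk proactively, rather than only after a fatality.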
- Human-Driven Cars
Realize that there are still going to be human drivers during the emergence of self-driving cars. In essence, you’ll have some smattering of self-driving cars on the roadways and mixing with some smattering of human-driven cars on the highways and byways too. Perhaps, far in the future, we will have only self-driving cars, but no one can say whether or when that will occur. Meanwhile, self-driving cars are intently observing the actions of human-driven cars (actually, all cars, though especially human-driven cars). This data can indicate places that seem to inspire human drivers to drive recklessly, and there might be ways to lessen the number of annual car crashes (and fatalities) accordingly.
- Anonymized Passenger Behavior
What will people opt to do while riding in self-driving cars? Those who had begrudgingly driven in heavy traffic to work each day will now be able to sit back and relax while the AI driving system contends with the road rage entailments. I’ve pointed out in my columns that an estimated 70 billion hours annually of human driving time will eventually be given back to humanity and can be used for other purposes. Most self-driving cars are going to have inward-facing cameras, allowing for interactive Zoom-like discussions while riding, perhaps doing work meetings while on-the-road or helping your kids with their homework while commuting home at night. The general activities of passengers could be anonymized and used to indicate what people do during their self-driving car journeys, which certainly would be of interest to advertisers (big bucks there!).
For more details about ODDs, see my indication at this link here: https://www.aitrends.com/ai-insider/amalgamating-of-operational-design-domains-odds-for-ai-self-driving-cars/
On the topic of off-road self-driving cars, here’s my details elicitation: https://www.aitrends.com/ai-insider/off-roading-as-a-challenging-use-case-for-ai-autonomous-cars/
I’ve urged that there must be a Chief Safety Officer at self-driving car makers, here’s the scoop: https://www.aitrends.com/ai-insider/chief-safety-officers-needed-in-ai-the-case-of-ai-self-driving-cars/
Expect that lawsuits are going to gradually become a significant part of the self-driving car industry, see my explanatory details here: https://aitrends.com/selfdrivingcars/self-driving-car-lawsuits-bonanza-ahead/
The aforementioned uses of the multi-cloud amalgamated self-driving collected data are but the tip of the iceberg. Uses run the full gamut from fundamental research to everyday practical application.
In some ways, the data would be beneficial for society overall. That’s a big-picture perspective and admirable. There is also a tremendous amount of monetization that can be gleaned from the multi-cloud endeavor.
For the hospitals that are embarking down this same path, they are emphasizing that data can save lives. The same type of mantra can readily be applied to the self-driving see-all data. Pedestrian lives can be potentially saved or at least injuries averted. Car crash fatalities can be reduced. Admittedly, the hospitals have a greater claim to the lives-saved moniker, but the point is that the self-driving collected data has a lot of potential in amazing ways that might not at first blush be evident.
Finally, now that we’ve covered the advent of self-driving cars as an instance of autonomous vehicles, widen the scope to include other types of autonomous vehicles.
There are going to be disparate clouds devoted to one commercially utilized autonomous drone versus other such autonomous drones. You can tie their data together and get synergies. There will be disparate clouds across the myriad types of autonomous vehicles. Imagine the synergies from piecing together cloud-based data from self-driving cars with self-driving trucks and toss into the mix the data from those ubiquitous autonomous drones that we keep hearing about (they will eventually get here).
A treasure trove of data awaits.
Come on, autonomous vehicles, get your act together, and move forward to enable the multi-cloud amalgamations to arise.
Think about all of this as you gaze idly up at the cumulus clouds in the sky, and imagine how those computing clouds down here on earth will soon be forming into cooperatives that provide some equally breathtaking results.
Copyright 2021 Dr. Lance Eliot