Catching Up with the USS Enterprise in a World of AI

Artificial intelligence is bringing us closer to the wonders of the Star Trek universe, which are no longer purely the stuff of science fiction.

In the 1960s, the Star Trek television series brought the vision of artificial intelligence into the living rooms of millions of people. AI was everywhere in the show, in the form of machines that had all the intelligence of humans — and a lot more.

Take, for example, the universal translator on the USS Enterprise. It could translate alien languages into English or any other language instantaneously. That, of course, was all science fiction back in the days when Lyndon B. Johnson was the U.S. president, as were a lot of the other AI applications in use on the starship. It was all the stuff of an imaginary future.

Now, let’s fast-forward to 2020. Today, we are moving ever closer to Star Trek’s imaginary future, as AI-driven applications become part of our everyday lives.

Language translation

In the Star Trek realm, the universal translator enabled the translation of spoken languages for real-time communication. This amazing device scanned brain-wave frequencies and used the results to create a basis for translation. While the operating principles of today’s language translation applications are quite different from those of the Star Trek era, we are coming closer to a world of accurate and instantaneous language translation systems.

Using techniques and technologies for natural language processing (NLP), today’s AI-driven translation systems provide efficient ways to help people who speak different languages communicate with one another. As Dell Technologies data scientist Lucas Wilson explains in a CIO.com blog on NLP, advances in language translation, voice-to-text conversion and conversational agents can now be linked together to enable enterprises to build fully automated customer support systems that can interact with callers more naturally than ever before.

“Then imagine having the corresponding models necessary to translate information back into any language your customers happen to speak,” Dr. Wilson writes. “These models could allow you to create your very own universal translator, ready to help you improve the customer support experience while streamlining your support structure.”

Bit by bit, these capabilities are making their way into today’s data centers. As an article on the StarTrek.com site notes, “Current automatic translation can’t compete with the 24th century’s, but it’s made astronomical progress in the past decade.”1
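
To make the idea concrete, here is a minimal sketch of how a voice-to-text model and a translation model might be chained into a simple “universal translator” for customer calls. It uses the open-source Hugging Face transformers library as a stand-in; the specific model names, the handle_call helper and the audio file are illustrative assumptions, not the systems described in this article.

```python
# A minimal sketch of chaining speech-to-text and machine translation,
# in the spirit of the "universal translator" pipeline described above.
# Model names and the audio file are illustrative assumptions.

from transformers import pipeline

# Speech-to-text: transcribe a caller's audio file into text.
transcriber = pipeline("automatic-speech-recognition", model="openai/whisper-small")

# Translation: convert the transcribed text into the support team's language.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")

def handle_call(audio_path: str) -> str:
    """Transcribe a French-language call and return an English translation."""
    transcript = transcriber(audio_path)["text"]
    translation = translator(transcript)[0]["translation_text"]
    return translation

if __name__ == "__main__":
    print(handle_call("customer_call.wav"))  # hypothetical audio file
```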

Augmented and virtual reality

Augmented reality (AR) enhances our view of the real world with layers of additional information and virtual objects. Virtual reality (VR) allows us to explore and interact with a virtual world. On Federation starships, immersive AR and VR experiences were always as close as the Holodeck, an environment that combined transporter, replicator and holographic systems. As the StarTrek.com site explains, Holodecks were used to relieve the stress and isolation of shipboard life for crew members and to provide an environment for tasks ranging from scientific simulation to tactical and covert training.

Today, we have forms of the Holodeck at play in countless AR and VR training and gaming applications that allow users to immerse themselves in the experiences at hand. This is the case with the Dell EMC AR Assistant, an app people can use to see clearly how to perform certain hardware servicing procedures on Dell EMC PowerEdge servers, as if they had an assistant explaining each step. The AR Assistant displays simple animations on top of the existing hardware, helping users see which components they should interact with and what tools they should use.

The AR Assistant app helps to guide users through certain hardware servicing procedures.

Intelligent scanner

In the Star Trek series, the starship crew made frequent use of a tricorder, a handheld device combining sensors, recorders and built-in computing capabilities that could scan and analyze just about anything — from human and alien bodies to elements in alien environments. With this device, Captain Kirk, Mr. Spock, Dr. McCoy and company could instantly diagnose medical conditions of crew members. Back when it made its debut, the tricorder was solidly in the camp of science fiction. Today, teams of bright people are working to bring tricorder-like medical devices to market.

One of these teams is a startup company named Basil Leaf Technologies. It has developed an early version of a device, called DxtER, that can diagnose and interpret multiple health conditions to various degrees, while continuously monitoring a patient’s vital signs. DxtER is built around an AI-based engine that learns to diagnose medical conditions by integrating learnings from clinical emergency medicine with data analysis from actual patients. It includes a group of non-invasive sensors that are designed to collect data about vital signs, body chemistry and biological functions. This information is then synthesized in the device’s diagnostic engine to help users make quick and accurate assessments.
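
DxtER’s actual diagnostic engine is proprietary and far more sophisticated than anything shown here, but a toy sketch can illustrate the general idea of fusing several noninvasive sensor readings into a single preliminary assessment. The VitalSigns fields, thresholds and flags below are simplified assumptions drawn from common clinical rules of thumb, not from Basil Leaf’s design.

```python
# Illustrative sketch only: fuse a few vital-sign readings into preliminary
# flags that a clinician (or a richer diagnostic model) would then review.
# Fields and thresholds are simplified assumptions, not DxtER's engine.

from dataclasses import dataclass

@dataclass
class VitalSigns:
    heart_rate_bpm: float     # beats per minute
    temperature_c: float      # degrees Celsius
    spo2_percent: float       # blood oxygen saturation
    respiratory_rate: float   # breaths per minute

def preliminary_assessment(v: VitalSigns) -> list[str]:
    """Return a list of flags based on simple clinical rules of thumb."""
    flags = []
    if v.temperature_c >= 38.0:
        flags.append("fever")
    if v.spo2_percent < 92.0:
        flags.append("low blood oxygen")
    if v.heart_rate_bpm > 100.0:
        flags.append("elevated heart rate")
    if v.respiratory_rate > 20.0:
        flags.append("elevated respiratory rate")
    return flags or ["no flags raised"]

print(preliminary_assessment(VitalSigns(112, 38.4, 95, 22)))
```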

While the DxtER development work is ongoing, the device is already a prize winner. The development team, then working under the Final Frontier Medical Devices name, won a $2.6 million prize in the global Qualcomm Tricorder XPRIZE competition. Launched in 2012, the competition challenged teams to develop a consumer-focused, mobile integrated diagnostic device inspired by the medical tricorder of Star Trek fame.2

Threat intelligence

Threat intelligence gives security teams a better view into the threat landscape. It provides context on which trends to monitor, how threat actors behave and where an organization may be most vulnerable to attack. In Star Trek, Starfleet Intelligence handled this work, collecting and analyzing intelligence information on potential and known adversaries and identifying possible threats.

Today, down on Earth in 2020, organizations have sophisticated AI-driven systems and defense mechanisms in place to detect and stop cyberattacks and other types of threats. Many organizations count on companies like Secureworks for these sorts of threat detection and response services. Secureworks provides threat intelligence services that combine human intelligence and supervised machine learning to help security teams gain deep insight into the threat landscape.
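
As a rough illustration of the supervised machine-learning half of that equation, the sketch below trains a classifier on a handful of labeled network events and then scores new ones. The features, data and model choice are made-up assumptions for illustration; they do not reflect Secureworks’ actual systems.

```python
# A minimal sketch of supervised machine learning for threat detection:
# train a classifier on labeled network events, then score new events.
# All features and data below are invented for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy features per event: bytes sent, failed logins, distinct ports touched.
X_train = np.array([
    [1_200,  0,  2],   # benign
    [900,    1,  1],   # benign
    [85_000, 0, 40],   # scan/exfiltration-like
    [500,   25,  1],   # brute-force-like
])
y_train = np.array([0, 0, 1, 1])  # 0 = benign, 1 = suspicious

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

new_events = np.array([[70_000, 0, 35], [1_000, 0, 2]])
print(model.predict(new_events))  # e.g. [1 0]: flag the first event for an analyst
```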

Mind reading

Another wonder Star Trek brought us was the Vulcan mind meld, a technique that allows a Vulcan to merge his or her mind with the essence of another’s mind through specialized fingertip contact. While we are still quite far from mind melding down here on Earth, we are inching closer to the day when we may be able to use AI to read minds.

Here’s one example. A team of researchers from McGill University and the University of Montreal is making breakthroughs with functional magnetic resonance imaging (fMRI), scanning people’s brains while they carry out various cognitive tasks. One of the goals of this project is to create computational models of how the brain works, and then use those models to train artificial neural networks to map the images to actions quickly and accurately. Yes, we’re talking about mind reading.

To collect the datasets for this ambitious effort, the research team recruited a half dozen subjects to watch videos, look at images and play video games while lying in an MRI machine. The machine allows the researchers to track and record the activity in the brains of the subjects as they carry out their tasks. The researchers expect to gather many terabytes of data over the course of the five-year study, during which time each subject will spend around 500 hours in an MRI machine.
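
The sketch below illustrates the basic “brain decoding” idea in miniature: train a small neural-network classifier to map fMRI-derived feature vectors to the task a subject was performing. The data here is synthetic and the model deliberately tiny; the research team’s actual models, features and datasets are far larger and are not reproduced here.

```python
# Illustrative sketch of "brain decoding": map fMRI-derived feature vectors
# to the task a subject was performing. All data below is synthetic.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

n_samples, n_voxels = 200, 500                 # pretend each sample is a flattened fMRI volume
X = rng.normal(size=(n_samples, n_voxels))
tasks = rng.integers(0, 3, size=n_samples)     # 0 = watch video, 1 = view image, 2 = play game
X[np.arange(n_samples), tasks] += 3.0          # inject a weak task-dependent signal

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X[:150], tasks[:150])                  # train on the first 150 samples
print("held-out accuracy:", clf.score(X[150:], tasks[150:]))
```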

This data-intensive research requires a great deal of high performance computing (HPC) and data science expertise. To meet this need, the team sought the help of Dell Technologies and Intel, along with the data scientists and supercomputing resources of the Dell Technologies HPC & AI Innovation Lab in Austin, Texas.

Key takeaways

Thanks to steady advances in high performance computing and the technologies for artificial intelligence, we’re moving closer to a Star Trek-like world where machines can think and act like humans. Devices and capabilities that were once solidly in the camp of science fiction are now inching into the mainstream of everyday life — and enriching our lives in countless ways.

______________

  1. StarTrek.com, “How Artificial Intelligence Is Getting Us Closer to Star Trek’s Universal Translators,” November 20, 2019.
  2. XPRIZE Foundation, “Family-Led Team Takes Top Prize in Qualcomm Tricorder XPRIZE Competition for Consumer Medical Device Inspired by Star Trek®,” April 13, 2017.

Copyright © 2020 IDG Communications, Inc.