Consumption to Co-Experience
Fri, 24 Jul 2020

Technologies are breathing new life into the media and entertainment industry. NEON's Head of Strategy, Bo Moon, discusses how co-experiences will shape the next chapter in Media and Entertainment.

The post Consumption to Co-Experience appeared first on NEON.


In the Media and Entertainment industry, “content is king,” and companies are challenged to continuously feed the never-ending appetite of consumers.

When Consumer Consumption Leads to Content Oversaturation

The Media and Entertainment (M&E) industry thrives on consumer consumption. The major paradigm shifts the industry has experienced over the last decade range from new formats and platforms to new devices and usage behaviors. But every driver of change originated to address the same thing: people’s increased desire to consume.

Credit: Adobe Stock

Consumption on large format screens and linear broadcasts has been replaced by multiple devices and on-demand watching. Fragmentation has led to a massive wealth of options and changing modes of experiences, and the road to content creation is no longer one lane. With every provider expanding its platform offerings and committing billion-dollar budgets for new content, we’ve entered an age of content oversaturation.

Now that consumers’ playlists are overflowing with content, there is an increasing shift away from quantity in favor of a more refined experience.

The next phase for the industry represents its most dynamic change yet, one that redefines the viewer’s relationship to content creators, brands, and even the content itself.

Credit: Adobe Stock

When Content Oversaturation Calls for New Experiences

Consumers want to do more than digest content. They want to have deeper, more interactive experiences. The ultimate refinement of the media consumer experience is when it evolves from consumption to co-experience. 

When text messages first moved from linear communication to dynamic group chat rooms that span months or even years, messaging became a co-experience. The Facebook Messenger chat room that I have with my high school buddies started years ago and is where we share our experiences, our family updates, and our favorite memes.

Similarly, when it became possible to watch the best gamers in the world stream precision shots and expert strategies via Twitch, gaming became a virtual co-experience, particularly because I could chat in realtime with them as a paid subscriber. Now, gaming platforms like Fortnite are delivering amazing co-experiences as venues for virtual concerts, product launches, and conferences.

Imagine a day when on-screen actors who star in pre-recorded shows come to life in daily, one-on-one realtime interactions with their fans. News anchors will share breaking news and repeat the latest headline when asked. And when the media you watch on screen becomes the media you interact with after the show is over, and engage with on an ongoing basis, consumption will forever change into co-experience. We are not talking about ten years from now; we are talking about today.

When New Experiences Change the Future

We are entering a new era of possibilities in Artificial Human technology, and Media and Entertainment will be one of the first areas of major impact. The technology now exists to break the barriers that prevent scalable generation of visual artificial life. Artificial Humans are not agents following rule-based approaches. Their micro-features, emotions, and gestures – the nuances of human behavior – convey natural, life-like communication. What better way to acquire, co-experience with, and retain your audience than another human?

As Artificial Humans make themselves available to people on an interpersonal level, their influence will also have huge impact on other industries. I imagine the possibility of watching the upcoming Bond movie, No Time To Die, and afterwards, walking into an Aston Martin car showroom only to be greeted by Bond himself on a large screen kiosk, in a tailored suit with martini in hand. He would then go on to introduce me to the non-spy version of his latest car, the Valhalla. 

The Media and Entertainment industry will intertwine with automotive and retail in ways previously unimaginable. Winners in this new media co-experience world will be those that pioneer new use cases and push the boundaries of the technology for their industries.

Artificial Humans will unlock the concept of co-experiences in media for the M&E and adjacent retail industries, setting up a new paradigm for consumer engagement. New programming will shed light on the daily, normal life of the Artificial Human actor, news anchor, or musician, and allow fans to engage with them in realtime. The possibilities for participation are endless when fans can have a co-experience with their favorite on-screen star.

Artificial Humans will interact in emotionally deep and profoundly personalized ways, and in that new reality, consumers will all become kings of their own content.

Decoding CORE R3
Fri, 28 Feb 2020

The 3 Rs: Reality, Realtime, Responsive. NEON’s Head of Technology, Abhijit, discusses the design process of CORE R3, which powers the human-like NEONs.

The post Decoding CORE R3 appeared first on NEON.


The Human Interface

In the quest to enhance the symbiotic relationship between humans and machines, the two have become increasingly entangled. When technology develops faster than human needs, we start to adapt to the tools that we created to help us. The ubiquity of our technologies has made us more impatient, more convenience-craving, and, in many ways, more machine-like.

Trends come and go, but our yearning for technology to sound, behave, and look like us has never changed. We, as human beings, welcome human experiences and human connections; we want that understanding and emotional feel.  

Transcending Natural Limitations 

NEON was built on a foundation of making interactions with the technology around us more human. Our venture started with us asking whether the interface with technology could go beyond buttons, controls, and voice commands. Could we push the boundaries of what is currently available? To do this, we needed to address the chain of logical reasoning we use to judge a real human interaction:

  1. Do they look real? (Reality) 
  2. Is there lag in their movements? (Realtime)
  3. Do they react to what I say? (Responsive)

Together, these 3 Rs (Reality, Realtime, and Responsive) form the pillars of CORE R3. We turned to research as a rich source of inspiration to develop our algorithms, and incorporated some canonical principles: Behavioral Neural Networks, Evolutionary Generative Intelligence, and Computational Reality.

Let’s dive in and demystify each one.

Breaking Down the Code

CORE R3 begins with neural networks: computational systems in which an adaptive model “learns” to perform specific tasks such as recognizing patterns, classifying objects, and making predictions. Information flows through such a network when patterns of data are fed in via input units. Our Behavioral Neural Network was extensively trained on human data, learning human gestures, movements, and facial expressions along the way. We determined the set of weights that maximized our model’s accuracy in generating different human representations. This sets the foundation for the hyper-realistic features and movements of our NEONs and lays the groundwork for further abstraction.
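To make the weight-adjustment idea concrete, here is a deliberately miniature sketch in plain Python: a single logistic neuron learns a toy pattern (logical OR) by nudging its weights against the prediction error. The task and all names here are invented for illustration; NEON's actual engine is vastly larger, but the principle of fitting weights to maximize accuracy is the same.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy training data: input patterns -> target labels (logical OR).
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

# Randomly initialized weights and bias -- the quantities "learning" adjusts.
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = 0.0
lr = 0.5  # learning rate

for epoch in range(2000):
    for x, y in data:
        pred = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = pred - y  # logistic-loss gradient at the output
        # Gradient step: move each weight against the error signal.
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

predictions = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(predictions)  # [0, 1, 1, 1] -- the OR pattern has been learned
```

The same loop, scaled up by many orders of magnitude and fed gesture and expression data instead of bits, is the essence of how a behavioral network fits human nuance.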

While neural network architectures can become extremely proficient at classifying human features and behaviors, they’re not very adept at creating them. Evolutionary Generative Intelligence addresses this lack of imagination in traditional neural networks within our CORE R3 engine. Without any expressive or behavioral inputs, our algorithm learned to generate new, synthetic data in a recursive process. It began producing novel, original content that had never existed before. We started creating new realities.
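The generate-score-select loop behind evolutionary approaches can be illustrated with a toy hill-climbing sketch: candidates are mutated and the fitter survivor is kept, so the system recursively arrives at output it was never explicitly given. The target string and fitness function below are invented for the example; in a real generative system the score would come from a learned model, not a hand-written comparison.

```python
import random

random.seed(1)

TARGET = "neon"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate):
    # Score a candidate: count of positions matching the target pattern.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate):
    # Produce a variant by replacing one random character.
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

# Start from random "noise" and evolve: generate a child, keep it if it
# scores at least as well as its parent, repeat.
best = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
for _ in range(5000):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child

print(best)  # converges to "neon"
```

The key property is that every output is synthesized, not retrieved: the loop only ever sees scores, yet it ends up producing content that matches the desired distribution.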

 

The third model we applied in CORE R3 was Computational Reality, an already mature technology combining computer graphics, computer vision, and imaging. When we combined this traditional methodology with our two other original machine-learning-driven paradigms, the result was an engine that gives rise to individual NEONs.

 

Humanizing Technology 

Artificial Intelligence (AI) has become one of the most pervasive technologies in modern life; you would be hard-pressed to escape AI in today’s world. But technology should erase boundaries between people, not create them. Many digital technologies of today cherry-pick a sub-circuit function to model, compose it in simulation, and slowly reveal improvements in specific goal-oriented functions (think NLP and voice assistants). But these outputs remain robotic in nature, leading us further away from natural human behaviors. While algorithms can find the optimal solution to a highly constrained problem, these sets of instructions have yet to capture and resonate with our feelings. Lines of code still do not equate to human thinking, much less to human intelligence or emotion.

But here at NEON, we are harnessing the elusive elements that make up the very fabric of who we are. Our next AI engine, SPECTRA, complemented with CORE R3, will enable interactions with technology in a way that has never been done before. The framework will allow for true conversations and connections, and the platform will learn and build a working memory model of your sentiment, letting us interface with technology in the most natural way: the human way.

Welcome to the NEON Blog
Thu, 27 Feb 2020

NEON is excited to connect with our community of friends through more updates and insights. Follow our blog as we share what we love to do most: create NEONs for the world.

The post Welcome to the NEON Blog appeared first on NEON.


Welcome to the launch of the NEON blog.

After our CES 2020 introduction, the NEON team is excited to connect with our community of friends through more updates and insights. We’re starting this blog in hopes of sharing what we love to do most: creating NEONs for the world and watching them become part of our day-to-day lives at the intense pace of our changing digital technologies.

Here, we’ll publish stories about NEONs and the people who design them, showing how both contribute to NEON’s integration into the world. The how-to’s, the seeds of ideas, and the inspirations whirling around our drawing boards will be brought straight to you.

Thank you for visiting, reading, and following. We’re excited to have you here.

Sincerely,
The NEON team
