This talk is about what it feels like to be a bot developer on the cutting edge of putting AI into bot brains. A bit like the fictional Frankenstein project, my team injects the organic, voice-based personality of an avatar into a Mixed Reality shape: be it a hologram, a robot, a VR or AR experience. Amalia, our first public-facing pilot in a shopping centre in Cologne, proactively made shopping suggestions and gave tips on where to get nice gifts. I will present findings from our experiment featuring Amalia's growing sense of humour.
Target Audience: Anyone and Everyone
Ask our live AI bot hologram how she/he/it feels in any language: reports from the frontiers of Mixed Reality. We were going to bring Amalia II to Messe Muenchen, but she apologises to the OOP audience for her absence: she is really busy in her first job as a multilingual publicist at a marketing academy in Munich. However, we are happy to invite you to test her live on a much larger, "open to the public" scale at Munich Central Station (Hauptbahnhof), where she will be working as an Infotainment Wayfinder for one of our First Mover clients!
Today she will give the 5-minute speech she delivered at her global launch in Bavaria last month. In that address, in English, Amalia II will "sell herself and her capabilities" to you: what she can do, how and why! She currently speaks 7 languages fluently: English, German, Spanish, French, Russian, Mandarin and Japanese. If you wish to revisit her speech in any of these languages, we will give you the secret "wake words" so you can ask her in person in your language of choice, a special treat for being an OOP attendee here today. As we scale up, we are adding ca. 130 languages to her AI bot brain - we've already tested Finnish, Dutch, Tamil, Hindi and Igbo!
In Germany in April 2019, we completed our (world) first "public-facing" pilot in a shopping centre in Cologne, which attracted researchers from Delft University, Harvard and Yale, who contacted me for recorded, in-depth interviews on our R&D. That character, "Amalia I", learnt all about 30 shops and services in German. As a 3D hologram avatar speaking German only on site, she proactively made shopping suggestions, gave tips on where to get family members nice gifts and asked if you'd like to know about events at this mall with 10 million visitors per annum. Amalia I's 2D chatbot replica online, however, spoke dozens of languages, though only via text, because Facebook does not allow bespoke Messenger bots to use voice or speech recognition.
The Mixed Reality, voice-based Amalia I gave directions to the loos, the food court and the exit to the U-Bahn. She could tell you the Cologne weather in real time and, speaking German, she even began to understand the Kölsch dialect, though she replied in High German! She developed an amazing ability to understand shoppers speaking German with a heavy accent or inaccurate grammar. In one instance, she even began tutoring a second-generation migrant teenager to improve his language skills.
In this short talk, I will present findings from our experiment featuring Amalia's growing sense of humour. This analysis forms part of my world-first textbook on business chatbots (forthcoming, Business Expert Press, New York, 2020). The handbook also contains the first-ever glossary for this Emerging Tech niche and doubles as "Guidelines on Procurement for Public and Private Sector Purchasers". It features dozens of original, instructive Case Studies never before published - some anonymised due to Confidentiality Agreements, others named as they occurred in the public domain yet remain little understood, given the global prevalence of hype over analysis in this IVA niche. My book on AI bots can be pre-ordered from BEP online.
I will refer to some of these Case Studies in my talk, in addition to the human-machine interactions logged in my Medium.com posts about our "Cologne Experiment" and documented through my ResearchGate account. More interviews and videos about our innovative voice CX tech in Mixed Reality can be found on www.taniapeitzker.expert
or you can follow our Münchener UG online www.ai-baas.com
AI has a rapidly growing presence in today’s world, with applications ranging from heavy industry to education. From accelerating plant operations to information access, there are many examples illustrating how digital companions enabled with AI have the potential to fundamentally change many aspects of our daily life, especially when it comes to the way we as humans interact with our environment and the workplace. This talk will explore how we can realize intelligent digital companions for enhanced industrial products, services and solutions.
Target Audience: People interested in applied AI in industrial domains
Prerequisites: Basic understanding of industrial data-driven applications
Artificial intelligence (AI) has a rapidly growing presence in today’s world, with applications ranging from heavy industry to education.
From accelerating plant operations to information access, there are many examples illustrating how digital companions enabled with AI have the potential to fundamentally change many aspects of our daily life, especially when it comes to the way we as humans interact with our environment and the workplace.
Industrial domains are beginning to explore the potential of AI and digital companions, driven by the rapid increase in connected devices and digital processes. Large amounts of diverse data and knowledge, such as video footage, records tracking product life cycles and digital twins, allow us to connect engineering know-how, data and algorithms to produce new insights and applications. Digital companions are an ideal way to help us cope with the vast amount of diverse information and the complex decision-making required at an ever-increasing pace.
Given the potential benefits of industrial AI and digital companions, the session will explore how we can realize intelligent digital companions that enable new and enhanced products, services and solutions.