IBM WATSON/API/TONE ANALYSIS
“Throughout science fiction, robots have got a bad rap for their inability to understand human emotion, but a new update means IBM’s Watson AI can now understand our feelings. IBM has introduced three new APIs – Tone Analyzer, Emotion Analysis and Visual Recognition – and the effect is a far more human-friendly Watson.
While the Visual Recognition API is more about deep learning and image recognition, the other two will read our emotions. Tone Analyzer will be able to gauge your emotions just by the tone of your voice, while Emotion Analysis paints an even clearer picture of your current state of mind”. (Chris Moldrich)
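To make the Tone Analyzer idea concrete: the service returns JSON containing a list of scored tones, which a client application can then rank. Below is a minimal sketch in Python of how a client might pick out the dominant tone. The response shape loosely follows Watson’s documented format, but the field values and the `strongest_tone` helper are illustrative assumptions, not real API output.

```python
import json

# Hypothetical sample response shaped like a Tone Analyzer reply
# (a "document_tone" object with a list of scored tones).
# The scores here are made up for illustration only.
SAMPLE_RESPONSE = json.dumps({
    "document_tone": {
        "tones": [
            {"score": 0.61, "tone_id": "joy", "tone_name": "Joy"},
            {"score": 0.82, "tone_id": "confident", "tone_name": "Confident"},
        ]
    }
})

def strongest_tone(response_json: str) -> str:
    """Return the name of the highest-scoring document-level tone."""
    tones = json.loads(response_json)["document_tone"]["tones"]
    return max(tones, key=lambda t: t["score"])["tone_name"]

print(strongest_tone(SAMPLE_RESPONSE))  # prints "Confident"
```

In a real integration the JSON would come back from an authenticated HTTP request to the service rather than a local sample, but the ranking step would look much the same.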
“Microsoft has unveiled new technology that appears to have been borrowed from Star Wars. The company is calling it Holoportation, and it works by capturing a highly detailed 3D model of someone, and then beaming it to another user wearing a mixed-reality headset.
The result? If you’re wearing a HoloLens, there appears to be a living, breathing person in front of you. Although the quality looks a bit ropey when viewed directly through the HoloLens – and the tech comes with all the limitations of Microsoft’s headset – Holoportation still looks like something out of a sci-fi film”.
QUESTION – TAKE IT TO THE NEXT LEVEL
Is it possible to combine these technologies for self-analysis of health and wellbeing (projected onto a mirror), together with PET technology and EMR technology? A combination of great minds, thinkers, inventors and creators.
Understand yourself and you will understand your PAIN.
BY NATASHA PARKER