1 – Oculus Investor Greg Castle Unveils $12 million Fund for VR, Self-Driving Cars, and More

Anorak Ventures’ Greg Castle, an early investor in Oculus, this week announced $12 million in seed funding for cutting-edge companies in the areas of VR, AR, autonomous vehicles, robotics, and pioneering applications of AI and computer vision. In an email to VentureBeat, Castle wrote,

“Over the next ten years these technologies will change our lives. I look for highly specialized teams solving tough problems in these exciting areas that will help bring these technologies to the masses.”

To date, Anorak has made investments in Simbe, Dishcraft, LoomAI, and an undisclosed VR company. Investor and former EA executive Greg Richardson, along with Singularity University’s Reese Jones, will serve as advisors to the fund.

(Read the full article at VentureBeat and find fund info at Anorak Ventures)

2 – Industry Leaders Establish Partnership On AI Best Practices

The anticipated union of top AI and related-technology players formally debuted this week with the announced launch of the Partnership on AI (to benefit people and society), a nonprofit organization that will work to “advance public understanding of AI technologies and formulate best practices on the challenges and opportunities within the field.” Members include Amazon, DeepMind/Google, Facebook, IBM, and Microsoft. Experts and relevant leaders from academia, nonprofits, and policy organizations will be invited to join the organization’s Board. Founding members will contribute both financial and research resources to further the entity’s goals, and will share leadership direction.

(Read the full press release at Partnership on AI)

3 – Orchestra Music Created with the Help of Artificial Intelligence

Technology and business consulting firm Accenture was behind the AI-inspired “Symphonologie,” which debuted at the Louvre Pyramid in Paris last week. The musical project is meant to make the abstract idea of AI tangible and to illustrate how the technology can be applied across a broad range of domains. Explaining why the company chose this particular mode of communication, group chief executive Mark Knickrehm stated,

“We realized that music cuts straight through the list – there are so many different cultures and spoken languages around the world and we quickly went to music as something that transcends culture.”

The final performance was a collaboration between human and machine: articles on business and technology were analyzed for words and sentiment, words were categorized by sentiment, each emotion was matched with a musical element, and the resulting AI-generated melodies were put into the hands of a composer for arrangement. Data visualization artists created an accompanying visual presentation for a full multimedia experience. The futuristic performance is scheduled to go on a global tour soon.
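The pipeline described above can be sketched in a few lines: score text for sentiment, bucket the score into an emotion, and map each emotion to a musical element. Everything here (the keyword lexicon, the emotion-to-music mapping, the function names) is a hypothetical illustration of the general technique, not Accenture’s actual Symphonologie system.

```python
# Toy sketch of a sentiment-to-music pipeline. The lexicon and the
# emotion-to-music table are invented for illustration only.

# Step 1: a tiny sentiment lexicon (word -> score).
LEXICON = {"growth": 1, "innovation": 1, "success": 2,
           "risk": -1, "failure": -2, "disruption": -1}

# Step 3: each emotion bucket maps to a musical element (key, tempo).
EMOTION_TO_MUSIC = {
    "positive": {"key": "C major", "tempo_bpm": 120},
    "neutral":  {"key": "G major", "tempo_bpm": 90},
    "negative": {"key": "A minor", "tempo_bpm": 60},
}

def score_text(text):
    """Step 1: sum lexicon scores for every known word in the text."""
    return sum(LEXICON.get(w.strip(".,").lower(), 0) for w in text.split())

def bucket(score):
    """Step 2: categorize the sentiment score into an emotion bucket."""
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def melody_fragment(text):
    """Steps 1-3 combined: text -> sentiment -> emotion -> musical element."""
    return EMOTION_TO_MUSIC[bucket(score_text(text))]

fragment = melody_fragment("Success and growth despite risk")
print(fragment)  # {'key': 'C major', 'tempo_bpm': 120}
```

In the real project, the output of a step like this fed a human composer, who arranged the machine-suggested material into the final score.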

(Read the full article at CBSNews)

4 – Algorithm Could Enable Visible-Light-Based Imaging for Medical Devices, Autonomous Vehicles

A team of researchers from the MIT Media Lab has created an algorithmic approach that could be an important step toward next-generation imaging systems for medical devices and autonomous vehicles. The “All Photons Imaging” (API) approach “recovers visual information” from light scattered by its environment. Using laser pulses and a high-speed camera, the researchers exploited the light’s time of arrival to reconstruct an accurate image of a pattern cut into a “mask” (a thick sheet of plastic) placed behind a slab of material designed to mimic human tissue. Continued innovation in this line of study is important, as information carried by visible light is much more true-to-life than that gathered through X-rays or ultrasound, and it also allows images to be recovered in visually ambiguous conditions (e.g., fog or mist), an existing problem for vision systems in self-driving cars. The group published its results this week in Scientific Reports.
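A much simpler cousin of this idea, time-gating, illustrates why photon arrival time matters: ballistic (unscattered) photons arrive early, while scattered photons arrive late, so keeping only early arrivals recovers a cleaner image. The sketch below is a toy simulation of that principle under invented numbers; the actual API algorithm goes further and uses information from all arrival times, not just the earliest.

```python
# Toy simulation of time-gated imaging through a scattering medium.
# All arrival-time distributions and thresholds are invented for illustration.
import random

random.seed(0)

# A 1-D "mask": 1 = open (light passes), 0 = blocked. A toy stand-in
# for the pattern cut into the plastic mask in the MIT experiment.
mask = [1, 0, 1, 1, 0, 0, 1, 0]

def simulate_arrivals(pixel_open, n_photons=200):
    """Return photon arrival times (ns) at one camera pixel.
    Ballistic photons (early, ~1 ns) appear only where the mask is open;
    scattered photons (late, 3-10 ns) arrive everywhere."""
    times = []
    for _ in range(n_photons):
        if pixel_open and random.random() < 0.3:
            times.append(random.gauss(1.0, 0.05))    # ballistic, early
        else:
            times.append(random.uniform(3.0, 10.0))  # scattered, late
    return times

def time_gate(times, cutoff_ns=2.0):
    """Count only early-arriving photons: a crude least-scattered image."""
    return sum(1 for t in times if t < cutoff_ns)

counts = [time_gate(simulate_arrivals(p)) for p in mask]
recovered = [1 if c > 10 else 0 for c in counts]
print(recovered)  # matches the mask: [1, 0, 1, 1, 0, 0, 1, 0]
```

Open pixels collect dozens of early photons while blocked pixels collect none, so thresholding the gated counts reproduces the hidden pattern even though most detected photons were scattered.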

(Read the full article at MIT News and the research report on Scientific Reports)

5 – IBM Unveils Industry’s First Platform to Integrate All Data Types for AI-Powered Decision-Making

On Wednesday, IBM introduced its data-driven, cloud-based platform, Project Dataworks, which integrates and streamlines the processing of different types of data (such as IoT, social media, and enterprise databases) in one place. Core cognitive capabilities are built into the underlying solution, with cognitive-based machine learning in place to help accelerate the discovery of patterns and models in data. According to IBM, the platform was designed using the same approach used with The Weather Company (an IBM business), which includes flexible data architecture, rapid ingestion from many data sources (at speeds increased from 50 to hundreds of Gbps), and internet-scale data and analytics. The platform, available on IBM’s Bluemix, will likely be leveraged by data analysts and other professionals looking to connect various forms of data and gain insights that help improve customer experiences and transform business models.

(Read the full press release at IBM News)

Image credit: MIT News

MARKET RESEARCH x INDUSTRY TRENDS

TechEmergence conducts direct interviews and consensus analysis with leading experts in machine learning and artificial intelligence. Stay ahead of the industry with charts, figures, and insights from our unparalleled network, including executives from Facebook, Google, Baidu, Yahoo!, MIT, Stanford, and beyond: