Twitter-owned Periscope will soon be able to identify what is happening in live broadcasts with the help of Twitter's AI team, known as Cortex. According to Cortex, it is responsible for surfacing the most relevant videos, tweets, pictures and live Periscopes from the massive, constantly changing "firehose of content."
How can the new algorithm help?
The team demonstrated its live-stream scanning system to the MIT Technology Review, scanning and categorizing two dozen streams at once. To achieve this, the micro-blogging giant built a proprietary computer made entirely of GPUs, which feeds its findings into a deep-learning algorithm.
The MIT Technology Review notes that Twitter's AI team has developed an algorithm that recognizes what is happening in a live feed instantly. As an example, it said Cortex can recognize whether "the star of a clip is playing guitar, demoing a power tool, or is actually a cat hamming it up for viewers."
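Twitter has not published how Cortex works internally, but one common way to turn per-frame classifier outputs into a single stream-level label can be sketched as follows. Everything here is hypothetical: the function name, the majority-vote rule, and the `min_share` threshold are illustrative assumptions, not Twitter's actual method.

```python
from collections import Counter

def label_stream(frame_labels, min_share=0.5):
    """Aggregate per-frame classifier outputs into one stream label.

    frame_labels: top-1 label for each sampled frame (hypothetical
    classifier output). Returns the majority label if it covers at
    least `min_share` of the frames, else None (no confident label).
    """
    if not frame_labels:
        return None
    label, count = Counter(frame_labels).most_common(1)[0]
    return label if count / len(frame_labels) >= min_share else None

# Most sampled frames show a cat, so the stream is labeled "cat".
print(label_stream(["cat", "cat", "guitar", "cat", "cat"]))  # → cat
```

A majority vote like this smooths over individual misclassified frames, which matters for live video where lighting and camera angle change constantly.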
Whatever the scanning algorithm extracts is meant for in-app search. For example, someone might tag a live stream of a large building on fire as "omg wtf," while you would be searching for something like "building fire in California."
If Cortex and Periscope succeed, scanning the stream could identify the fire, and the broadcaster's location data would reveal that they were somewhere in California.
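The search idea above can be illustrated with a small sketch: match query terms against machine-generated labels and location metadata rather than the user's caption. All names and data here are hypothetical assumptions for illustration, not Periscope's actual search implementation.

```python
def matches_query(stream, query_terms):
    """True if every query term appears in the stream's machine labels
    or location metadata -- the user caption is deliberately ignored."""
    haystack = {t.lower() for t in stream["labels"]}
    haystack |= {t.lower() for t in stream["location"].split()}
    return all(term.lower() in haystack for term in query_terms)

stream = {
    "caption": "omg wtf",             # user tag, useless for search
    "labels": ["fire", "building"],   # hypothetical classifier output
    "location": "Oakland California", # from broadcaster geodata
}
print(matches_query(stream, ["building", "fire", "california"]))  # → True
```

The point of the design is that the unhelpful caption never enters the index; the classifier output and geodata do the work the broadcaster's tag cannot.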
Cortex will help Twitter sideline “dark content”
Cortex is a team of data scientists, engineers and machine-learning researchers dedicated to building a unifying representation of all the content and users on the micro-blogging site, with the aim of building products that people find interesting and use to share new experiences. A Twitter spokesperson said the Cortex team has been working with Periscope to test ways to identify and categorize content in live broadcasts. The team is focused on pairing the advanced technology with an editorial approach to provide a seamless discovery experience on Periscope, the spokesperson added.
Discovery is one side of the project; filtering content that violates Periscope's guidelines is the other. If a user is streaming a pay-per-view event or broadcasting NSFW content, scanning could help Periscope shut the stream down automatically. "Dark Periscope" has been in the news recently, with suicide and rape taking center stage. With a strong scanning algorithm, Periscope could sideline those streams before a large audience watches them.
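One plausible shape for such automatic enforcement is a tiered policy: cut streams the classifier is very confident about, and route borderline cases to a human. This is a minimal sketch under assumed names and thresholds; the article does not describe how Periscope would actually act on a violation score.

```python
def moderation_action(violation_score, auto_threshold=0.95, review_threshold=0.6):
    """Map a guideline-violation score from a hypothetical stream
    classifier to a moderation action: confident hits are cut
    automatically, borderline ones go to a human, the rest stream on."""
    if violation_score >= auto_threshold:
        return "shut_down"
    if violation_score >= review_threshold:
        return "human_review"
    return "allow"

print(moderation_action(0.98))  # → shut_down
print(moderation_action(0.70))  # → human_review
print(moderation_action(0.10))  # → allow
```

Keeping a human-review band between the two thresholds hedges against false positives, which matters when the cost of wrongly cutting a legitimate broadcast is high.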
Most Periscope users stick to the rules, which may be why search appears to be the project's main angle. The implications could be quite large, but for now it remains an internal experiment.