AI system infers music from silent videos of musicians

In a study accepted to the upcoming 2020 European Conference on Computer Vision (ECCV), MIT and MIT-IBM Watson AI Lab researchers describe an AI system called Foley Music that can generate "plausible" music from silent videos of musicians playing instruments. They say it works on a variety of music performances and outperforms "several" existing systems at generating music that is pleasant to listen to.

Image credit: MIT

Foley Music extracts 2D key points of performers' bodies (25 points in total) and hands (21 points) from video frames as intermediate visual representations, which it uses to model body and hand movements. For the music, the system uses MIDI representations that encode the timing and loudness of each note.
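
To make those two intermediate representations concrete, here is a minimal Python sketch. The keypoint counts (25 body, 21 per hand) and the MIDI fields (timing and loudness) follow the article; the class names, field names, and exact layout are illustrative assumptions, not the authors' actual data format.

```python
# Illustrative containers for the intermediate representations described
# above. Keypoint counts follow the article; everything else is an
# assumption for the sake of the example.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FrameKeypoints:
    """2D pose keypoints extracted from one video frame."""
    body: List[Tuple[float, float]]        # 25 (x, y) body keypoints
    left_hand: List[Tuple[float, float]]   # 21 (x, y) hand keypoints
    right_hand: List[Tuple[float, float]]  # 21 (x, y) hand keypoints

@dataclass
class MidiEvent:
    """One note event in the MIDI-style target representation."""
    onset: float      # when the note starts, in seconds
    duration: float   # how long it sounds
    pitch: int        # MIDI note number, 0-127
    velocity: int     # loudness, 0-127

# A performance clip is then a sequence of per-frame keypoints paired with
# the sequence of note events the model should predict.
VideoPose = List[FrameKeypoints]
MidiSequence = List[MidiEvent]
```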

Given the key points and the MIDI events (which tend to number around 500), a "graph-transformer" module learns mapping functions that associate movements with music, capturing the long-term relationships needed to produce accordion, bass, bassoon, cello, guitar, piano, tuba, ukulele, and violin clips.
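
As a rough illustration of how such a movement-to-music mapping could be wired up, the PyTorch sketch below encodes per-frame keypoints and autoregressively decodes a sequence of roughly 500 discrete MIDI event tokens. The model class, layer sizes, token vocabulary, and the simplified keypoint-mixing step (a plain linear layer standing in for a true graph convolution) are assumptions for illustration, not the published Foley Music architecture.

```python
# Hedged sketch: pose-sequence encoder + transformer decoder that emits
# MIDI event tokens. Sizes and structure are illustrative assumptions.
import torch
import torch.nn as nn

class PoseToMidi(nn.Module):
    def __init__(self, n_keypoints=67, d_model=256, vocab_size=512):
        super().__init__()
        # Stand-in for the graph step: mix the 2D coordinates of all
        # keypoints in a frame (25 body + 21 + 21 hand = 67) into d_model.
        self.keypoint_mix = nn.Linear(n_keypoints * 2, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=4,
        )
        # Decoder autoregressively emits discrete MIDI event tokens.
        self.event_embed = nn.Embedding(vocab_size, d_model)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=4,
        )
        self.to_logits = nn.Linear(d_model, vocab_size)

    def forward(self, keypoints, event_tokens):
        # keypoints: (batch, frames, n_keypoints, 2) 2D coordinates
        # event_tokens: (batch, n_events) previously generated MIDI tokens
        b, t, k, _ = keypoints.shape
        motion = self.keypoint_mix(keypoints.reshape(b, t, k * 2))
        memory = self.encoder(motion)          # per-frame motion features
        tgt = self.event_embed(event_tokens)   # embedded event tokens
        n = tgt.size(1)
        causal = torch.triu(                   # block attention to the future
            torch.full((n, n), float("-inf")), diagonal=1
        )
        out = self.decoder(tgt, memory, tgt_mask=causal)
        return self.to_logits(out)             # next-token logits

# Example shapes: 30 frames of 67 keypoints mapped to 500 MIDI event tokens.
model = PoseToMidi()
logits = model(torch.randn(2, 30, 67, 2), torch.randint(0, 512, (2, 500)))
print(logits.shape)  # torch.Size([2, 500, 512])
```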

Published by Kyle Wiggers, VentureBeat

Read more at: Massachusetts Institute of Technology

