The computer science research arm of Sony has launched a cloud-based AI music production tool called Flow Machines Mobile (FM Mobile) to help musicians generate ideas for new melodies, chords, and basslines.
The Sony Computer Science Laboratories (CSL) FM Mobile app, available on the Apple App Store and compatible with a range of digital audio workstations (DAWs), features a machine learning model that analyses musical data based on the style palette that users select to match the genre and chord progression of the song they want to create. Users can create their own original style palette in the app or choose from various preset palettes created by Sony CSL.
When users press the compose button, the AI will generate eight-bar melodies according to the selected chord progression, Sony CSL said.
“There are parameters such as note duration and melodic complexity, which allows users to have proposals from AI matching their intention,” the company added.
Users can then save their creations and import them into their DAW using Flow Machines Professional (FM Pro), a DAW plugin that Sony CSL developed in 2019, or into Apple’s GarageBand.
FM Mobile has been launched in Japan and the United States. The app will also be released in Europe, Sony CSL said, but a date has not been set.
Meanwhile, the Japanese conglomerate’s semiconductor business is set to release two types of stacked event-based vision sensors designed for industrial equipment.
The two sensors, Sony said, employ the company’s copper-to-copper connection technology to provide electrical continuity between the pixel chip and the logic chip, and feature a pixel size of 4.86 μm.
These features enable the sensors to detect changes in luminance as well as sense slight changes in vibration, abnormalities useful for predictive maintenance of equipment, and changes in sparks produced during welding and metal cutting, Sony said.
Additionally, Sony boasted that the two sensors are equipped with event-filtering functions developed by Prophesee.
“Using these filters helps eliminate events that are unnecessary for the recognition task at hand, such as the LED flickering that can occur at certain frequencies (anti-flicker), as well as events that are highly unlikely to be the outline of a moving subject (event filter). The filters also make it possible to adjust the volume of data when necessary to ensure it falls below the event rate that can be processed in downstream systems (event rate control),” Sony said.
Sony expects to begin shipping the sensors in October.