{"id":6257,"date":"2023-02-01T10:05:23","date_gmt":"2023-02-01T18:05:23","guid":{"rendered":"https:\/\/technarrativelab.org\/?p=6257"},"modified":"2024-01-29T17:17:12","modified_gmt":"2024-01-30T01:17:12","slug":"generative-ai-musical-edition","status":"publish","type":"post","link":"https:\/\/nostatic.com\/lab\/2023\/02\/01\/generative-ai-musical-edition\/","title":{"rendered":"generative AI : musical edition"},"content":{"rendered":"\n<p>The generative AI (gAI) train just keeps on a rollin&#8217;. This time it&#8217;s MusicLM from Google (<a href=\"https:\/\/arstechnica.com\/information-technology\/2023\/01\/googles-new-ai-model-creates-songs-from-text-descriptions-of-moods-sounds\/\" target=\"_blank\" rel=\"noreferrer noopener\">via ArsTechnica<\/a>), which generates music in different genres based on &#8220;rich captions.&#8221; So we now have fairly commoditized text (ChatGPT), image (DALL-E and Midjourney), and sound\/music.<\/p>\n\n\n\n<p>Perhaps in the near future there will be a growing market for &#8220;human-made&#8221; content &#8211; artisanal, if you will. Interesting times indeed.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The generative AI (gAI) train just keeps on a rollin&#8217;. This time it&#8217;s MusicLM from Google (via ArsTechnica), which generates music in different genres based on &#8220;rich captions.&#8221; So we now have fairly commoditized text (ChatGPT), image (DALL-E and Midjourney), and sound\/music. 
Perhaps in the near future there will be a growing market for &#8220;human-made&#8221; content&hellip;<\/p>\n","protected":false},"author":1,"featured_media":6258,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[2,6],"tags":[],"class_list":["post-6257","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-ml","category-creativity"],"acf":[],"_links":{"self":[{"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/posts\/6257","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/comments?post=6257"}],"version-history":[{"count":1,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/posts\/6257\/revisions"}],"predecessor-version":[{"id":6445,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/posts\/6257\/revisions\/6445"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/media\/6258"}],"wp:attachment":[{"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/media?parent=6257"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/categories?post=6257"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/nostatic.com\/lab\/wp-json\/wp\/v2\/tags?post=6257"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}