• 0 Posts
  • 62 Comments
Joined 1 year ago
Cake day: June 2nd, 2023

  • This comment implies that no humans were involved in operating the AI. Seems doubtful.

    It’s one thing for out-of-touch executives to blindly replace entire departments with “AI” while fundamentally misunderstanding both the role of the department being replaced and the capability of the AI, tanking the quality of the product; that’s real self-harm for everyone involved. It’s another thing to advance the creative process with more advanced tools and automation, something we’ve been doing for centuries without much fuss.

    The creative part of voice acting isn’t just in moving one’s lips. It is just as much, if not more, in feeling and direction: in deciding whether a sound sample produces a certain desired emotion, and whether that emotion adds anything to the overall experience. That is not the territory of generative AI; it is the territory of AGI, which does not yet exist. Producing the sound with your lips is only a small part of the work. There is still a human involved in producing the work of art (and if not, then yes, we are back in that first category, of leadership ignorant of the creative process, and we should bemoan a crappy product led by executives who have no clue how to retain talent).


  • “knows me extremely well, is able to tirelessly work on my behalf, and has a personality tailored to my needs and interests.”

    Those may still be ANI applications.

    Today’s LLMs, marketed as the future of AGI, are more focused on knowing a little bit about everything, including a little bit about how MRIs work and a summary of the memes floating around a parody subreddit. I fail to see how LLMs as they are trained today will come to know you extremely well or give you a personality tailored to your needs. I also think the commercial interests of big tech are pitted against your desire for something that “tirelessly work[s] on my behalf”.


  • The big problem with AI butlers for research, IMO, is that stripping out the source takes away important context that helps you decide whether the information you are getting is relevant and appropriate. Was the information posted on a parody forum, or is it an excerpt from a book by an author with a Ph.D. on the subject? Who knows. The AI is trained to tell you something that you want to hear, not something you ought to hear. It’s the same old problem of self-selecting information, but magnified a hundredfold.

    As it turns out, data is just noise without some authority or chain of custody behind it.


  • Smarter Americans in the past recognized that freedom, including the free market, doesn’t just happen of its own accord; it has to be defended and legislated. That is how antitrust laws came to be in arguably the most capitalist nation on earth.

    Apathetic Americans now have lost sight of the importance of protecting their freedoms.

    “Illegal” is not just some hypothetical moral absolute. It is the politics of defending one’s values. Americans clearly no longer value either their freedoms or the free market.