Comments on: what time is it in london? daringfireball gets its knickers in a knot over the answer

By: whbeebe (Sat, 23 May 2020 14:35:31 +0000)
In reply to Marc Beebe.

There’s so much wrong with Gruber’s response, on so many levels. It must have been a really bad day for him to go off the way he did.

Your points and observations, as usual, are all well taken. Using AI for TOS violations is far, far worse than anything poor old Siri might not have gotten right. We’re talking Google/YouTube and Facebook primarily. I read a story that classical musicians are getting blocked on Facebook during live performances because the AI has decided that some fraction of the performance of a dead musician is in violation of some other musician’s rights. The entire automated TOS violation takedown system sucks dead hamsters through a garden hose.

By: Marc Beebe (Sat, 23 May 2020 14:24:36 +0000)

Let’s see, the first problem here is that no one decided to get a data set greater than a handful of attempts. That’s not very scientific sampling. The second problem is the attitude that Siri is “stupid” because it knows there’s a London, Ontario, whereas any “sensible” human being probably doesn’t and/or assumes you mean London, England. The third problem is a psychotic over-reaction to this “failure” of artificial intelligence and the extension that it would be cause for eternal damnation if it were a real person. Have I got that about right?
I’m no fan of AI by any means, but frankly I find it more amusing when it goes wrong like that and only frustrating if it’s failing to do some truly important task – usually one which any “sensible” human being should have been smarter than to assign to it in the first place. Like parsing post content for TOS violations or driving a car.
