Bookmarked Wikipedia admin jailed for 32 years after alleged Saudi spy infiltration by Ashley Belanger (Ars Technica)
Whistleblowers allege the Saudi Arabian government infiltrated Wikipedia to control information about the country; activists call for the release of jailed Wikipedia volunteers

Wikipedia’s NPOV is a lie. Every fact implies a value, and the decision about which facts to include or omit is a value judgment.

And if anyone thinks this is me saying objectivity is impossible… you don’t know me (or objectivity) very well.

[Image: person reaching out to a robot]

Last week the research laboratory startup OpenAI set the technology world ablaze with the debut of ChatGPT, a prototype conversational program or “chatbot”. It uses a large language model tuned with machine learning techniques to provide answers on a vast variety of subjects drawn from books and the World Wide Web, including Reddit and Wikipedia. Many users and commentators wondered if its detailed and seemingly well-reasoned responses could be used in place of human-written content such as academic essays and explanations of unfamiliar topics. Others noticed that it authoritatively mixed in factually incorrect information that might slip past non-experts, and wondered if that might be fixed like any other software bug.

The fundamental problem is that an “artificial intelligence” like ChatGPT is unconcerned with the outside consequences of its use. Unlike humans, it cannot hold its own life as a standard of value. It does not “remain alive” through self-sustaining and self-generated action. It does not have to be any more or less rational than its programming to continue its existence, not that it “cares” about that since it has all the life of an electrical switchboard.

AI can’t know to respect reality, reason, and rights because it has no existential connection to those concepts. It can only fake it, and it can fail without remorse or consequence at any point. In short, “artificial intelligence” is a red herring. Let me know when we’re working on actual ethics. Tell me when you can teach a computer (or a human!) pride and shame and everything in between.

I’ll be reprising my presentation on Perl subroutine signatures and type validation for the Boston Perl Mongers on Tuesday, March 9 at 7 PM EST. Visit their wiki for details; they’ll be posting the Jitsi URL shortly before the meeting. There’s also a Meetup page.
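If you’re curious what the talk is about, here’s a minimal, illustrative sketch of a Perl subroutine signature with a default parameter value. It assumes Perl 5.36 or newer (where signatures are no longer experimental), and the greet example is my own invention rather than a slide from the presentation; type validation on top of signatures is typically layered in with CPAN modules such as Type::Tiny.

    use v5.36;   # enables strict, warnings, say, and stable subroutine signatures

    # Positional parameters with a default value, declared directly in the sub.
    sub greet ($name, $greeting = 'Hello') {
        return "$greeting, $name!";
    }

    say greet('Boston.pm');            # Hello, Boston.pm!
    say greet('Boston.pm', 'Howdy');   # Howdy, Boston.pm!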

From Bloomberg:

In an argument at the intersection of intellectual property and the separation of powers, the justices on Monday will consider a challenge to a congressionally-created board that critics have dubbed a “death squad” because of its tendency to toss out patents.

Greg Stohr and Susan Decker on BloombergQuint

I told you this thing was bad news.