A computer scientist called Dr Ben Shneiderman argues against fully automated AI, suggesting it could absolve humans of ethical responsibility.
We're replacing humans in certain places with systems that are robotic and artificially intelligent. And the designers need to make ethical decisions about what they imbue the software and the robots with. It's becoming a big deal for society.
He goes on to cite examples where fully automated AI is undesirable, such as the Boeing 737 Max's MCAS flight control system, nuclear reactors, or lethal military robots and drones.
However, he seems to be generalising too much for my liking.
I believe there are indeed situations where human input is needed. Until AI can cope with nuance and context, human input is necessary for many tasks. But there are plenty of situations where a robot can act autonomously, such as on assembly lines or when vacuuming a floor. As AI gets cleverer, the number of tasks suitable for a robot with fully automated AI will increase.
Therefore the particular caution I'd urge is that AI is not given complete power over tasks too soon and that, I'd suggest, is what went wrong with the MCAS system.
We must also remember that humans are particularly bad at making decisions themselves, for which there's ample evidence on daily news bulletins.
I tend to agree with the counter argument presented by Missy Cummings, director of Duke University’s Humans and Autonomy Laboratory:
The degree of collaboration should be driven by the amount of uncertainty in the system and the criticality of outcomes.
Nuclear reactors are highly automated for a reason: Humans often do not have fast enough reaction times to push the rods in if the reactor goes critical.
If you had a choice between:
definitely receiving £3,000, or
an 80% chance of receiving £4,000.
You'd choose option 1 and take the certain £3,000.
However, if you had a choice between:
definitely losing £3,000, or
an 80% chance of losing £4,000.
You'd do the opposite and take the 80% chance of losing £4,000.
Or at least that's what most people would do according to research.
This apparently demonstrates how we view losses in a different way to gains. It's all part of something called Prospect Theory and it won Daniel Kahneman the Nobel Prize for a paper he published with Amos Tversky in 1979. Recent research shows it still holds.
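The numbers make the asymmetry easy to see. Here's a quick sketch of the expected values (the amounts are the classic 1979 figures; the helper function is mine, purely for illustration):

```python
def expected_value(outcomes):
    """Sum of amount * probability over the possible outcomes."""
    return sum(amount * p for amount, p in outcomes)

certain_gain = expected_value([(3000, 1.0)])            # 3000.0
gamble_gain = expected_value([(4000, 0.8), (0, 0.2)])   # 3200.0

certain_loss = expected_value([(-3000, 1.0)])           # -3000.0
gamble_loss = expected_value([(-4000, 0.8), (0, 0.2)])  # -3200.0

# The gamble is worth £200 more on average, yet most people refuse it;
# the certain loss is £200 better on average, yet most people refuse that too.
print(gamble_gain - certain_gain)   # 200.0
print(certain_loss - gamble_loss)   # 200.0
```

A purely rational agent would take the gamble in the first case and the certain loss in the second. Most of us do the exact opposite.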
Mark Zuckerberg says Facebook is ready for the job of preventing electoral interference on its platform:
Countries are going to continue to try and interfere and we are going to see issues like that but we have learnt a lot since 2016 and I feel pretty confident that we are going to be able to protect the integrity of the upcoming election.
Although the report goes on to say that Facebook will be less strict with coronavirus misinformation:
On the coronavirus pandemic, Mr Zuckerberg said that while Facebook had and would remove any content that would likely result in "immediate harm" to users it would not stop groups alleging that the infection was state sponsored or connected to the launch of the new digital 5G network.
Instead, they will add a warning label to such nonsense. Although they did recently remove David Icke for the repeated offence of posting unmitigated bullshit.
The thing that baffles me is that anyone takes anything posted on social media seriously. You just have to look at why people post on those platforms in the first place to realise that accuracy and truthfulness are not primary motivations.
The government is suggesting the middle seat on planes should be left vacant, but that still leaves you just a few feet from your neighbour. Let's not forget that cattle-class on a plane can only comfortably seat hobbits at the best of times. I'm pretty sure you have to check your legs into luggage to fly Ryanair these days.
Two metres is what has been touted as optimal distancing, which would probably mean a row to yourself and nobody in either the row in front or behind. It would be delightful but I doubt any airline could run a profit while instigating proper social distancing.
The bullshit web is something I rail against at length myself and Nick Heer's article sums up a lot of my own thoughts.
Violations of users’ intent are nothing new. Ad tech companies like Criteo and AdRoll created workarounds specifically to track Safari users without their explicit consent; Google was penalized by the FTC for ignoring Safari users’ preferences. These techniques are arrogant and unethical. If a user has set their browser preferences to make tracking difficult or impossible, that decision should be respected. Likewise, if a browser has preferences that are not favourable to tracking, it is not the prerogative of ad tech companies to ignore or work around those defaults.
Well, quite. In my opinion this sort of thing qualifies as hacking. It could even fall under the UK's Computer Misuse Act under Section 3. I'd argue they are trying to impair the operation of my computer, specifically its browser, and they certainly aren't authorised by me to do that.
I don't care if a website rejects me if I don't accept them tracking me, but I want to know. I don't want them to let me in and then track me by subverting my browser preferences without my knowledge. Just give me a clear choice. And no, putting up the standard sort of cookie confirmation or privacy agreement is not a clear choice.
As I said in a previous article, the default way in which I should enter any site should be with only (truly) essential cookies active and none of those should have anything to do with tracking or advertising. If I access a site without any intervention, that's the way it should be. If a site won't accept me on that basis it should intervene and tell me clearly that I can only continue if I allow it to track me. Give me the choice. They won't, though, because they probably figure that would scare me away (and they're right in a lot of cases).
I don't mind sites advertising to me or even tracking me (to a certain degree, anyway) and there are many I would give approval to. The thing is, I want complete, clear, up-front knowledge that I'm doing so.
The only way we'll achieve any control over this sort of thing is via legislation, and not the sort of half-arsed thinking that gave us cookie confirmation pop-ups.
The .org top-level domain is controlled by the Public Interest Registry (PIR), a non-profit organisation. The Public Interest Registry is itself a subsidiary of the Internet Society (ISOC), another non-profit organisation.
The Internet Society has plans to sell the Public Interest Registry — and the .org domain with it — to a company called Ethos Capital, about which nobody knows much. Ethos plans to buy the PIR for $1.1 billion via a leveraged buyout using the PIR's own assets as part of the leverage, which would mean the PIR ends up with a $300m debt as a result.
This all has to be approved by the Internet Corporation for Assigned Names and Numbers (ICANN) who oversee the domain name system for the entire internet.
ICANN seemed inclined to grant approval but they've met with some bitter opposition to the transaction, including from California Attorney General Xavier Becerra. This is significant because ICANN is incorporated in California and Becerra is therefore responsible for making sure ICANN lives up to its articles of incorporation, one of which states it operates for the benefit of the Internet community as a whole.
Several former ICANN officials are involved in the Ethos Capital transaction, which stinks mightily.
There are 10m+ .org registrations and the worry for users is that the price of domains could increase if the deal were to complete. Ethos Capital is definitely not a non-profit and could significantly increase prices to take advantage of those who want to hang on to their established .org domains.
Wednesday 15th April 2020 Briefly ...
You can now get wheels for your Mac Pro, should you have one. They're clearly made of diced unicorn mixed with tails of mermaid and then forged by Dwarven blacksmiths on the planet of Nidavellir because they cost $700. For four tiny wheels.
Google are planning to charge for the use of their reCAPTCHA anti-bot testing facilities. This leads me to have a rant about the constant interruptions that stand between us and a website's content these days.
Time can be a bit of a problem besides the problem of not having enough of it.
Relativity sees time as another dimension to add to the X, Y and Z space dimensions, forming what's sometimes called a block universe. The evolution of such a universe is like clockwork and everything follows from the initial conditions. If a suitably powerful demon knew all the properties of every particle in the universe, it could precisely predict the future. In such a universe the distinction between past, present and future is merely an illusion and all the information a universe needs is present at the very beginning.
Quantum mechanics sees things a bit differently. Quantum states are described by a wavefunction and, whilst the evolution of the wavefunction in time can be predicted, the outcomes of individual measurements cannot. Particles exist in superpositions — combinations of states — and there's no telling what you'll find until you make the actual measurement. Such a universe is not clockwork and the future cannot be completely predicted.
A physicist called Nicolas Gisin thinks the problem is mathematical and can be resolved by using a different type of mathematics.
The problem, says Gisin, is how we treat real numbers in standard mathematics. Real numbers are just the numbers we use all the time: 42, -6.3, 9.9999… etc. But the problem with them is that most have a string of decimal digits that, at any given time, we can only know to a certain precision. We have to zoom in further and make a more precise measurement to get the next decimal digit.
This process of zooming in on a number with more precision is called a choice sequence. Standard maths says none of this is a problem. Even though we may not know a number to absolute precision, we can treat it as if it nevertheless exists in full and use it on that basis. Importantly, all numbers follow the law of the excluded middle, which says that either a statement is true or its negation is true. Either x equals 1 or x does not equal 1, and that seems logical at first blush.
There is, however, another type of maths called intuitionist mathematics, and the law of the excluded middle does not apply there. Maybe we can't say x equals 1 and can't say x is not equal to 1 either. If we had a number like 0.999999 and we were sure the 9s would continue forever, we could say that x = 1 (because x would differ from 1 by less than any finite amount), but we're not sure how the sequence will continue. The next digit might be a 3, for example, and then x would certainly not equal 1.
Don't get bogged down in the maths; the important thing is that the lack of a law of the excluded middle gives us an imprecision that can only be resolved in the future. In the present we only have 0.999999 and, right now, the future of that number is indeterministic, or at least that's the case with intuitionist maths. In standard maths we'd say the number already exists in full and we can work with it as we please. It is already either 1 or not 1, even if we don't know which. It's a philosophical difference in some ways.
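Here's a toy way to picture that, assuming nothing beyond the idea that digits become known one at a time (the function names are mine, purely for illustration):

```python
from fractions import Fraction

def bounds(digits):
    """Exact interval known to contain x after seeing these decimal digits."""
    low = sum(Fraction(d, 10 ** (i + 1)) for i, d in enumerate(digits))
    return low, low + Fraction(1, 10 ** len(digits))

def could_equal_one(digits):
    """True while x = 1 is still consistent with the digits seen so far."""
    low, high = bounds(digits)
    return low <= 1 <= high

# Six digits in, x = 0.999999...: equality with 1 is still undecided.
print(could_equal_one([9] * 6))        # True
# The next digit turns out to be a 3: x is now definitely not 1.
print(could_equal_one([9] * 6 + [3]))  # False
```

In standard maths, x was always either 1 or not 1 even before we looked; in the intuitionist picture, nothing decides the question until the digits actually arrive.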
The point of all this is that by framing both quantum theory and relativity via a different type of mathematics, it may be possible to join them together, which is something that has been elusive for nigh on a century. In doing so we'd resolve the different ideas of time the theories have.
Why did Uri Geller cause so much friction over the years? Well, from my personal point of view, he didn't save Newcastle United from relegation like he promised, but he irritated a lot of people over the years, particularly James Randi.
Fixing a strange problem where macOS Safari doesn't allow you to tick extensions to enable them via preferences.
Wednesday 1st April 2020 Briefly ...
I note TSB has had some problems with their website, but the bit that interested me was that users were met with an Unexpected Error and I have to wonder if there's any other sort of error. Would you release software that's full of expected errors? I could just imagine an error along the lines of: Expected error, we just couldn't be arsed to fix it.
It reminds me of a bit of mainframe software I used to work on called JES2. It used to have an error that simply said Something Wrong, which is hardly a great start when it comes to debugging the error.
Error messages should at least point the user to a potential cause. It doesn't necessarily mean the user can fix it but it might hint at something they could try.
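As a made-up illustration of the difference (the file name and messages are entirely hypothetical):

```python
class ConfigError(Exception):
    """Raised when the application's configuration can't be loaded."""

def load_config(path):
    try:
        with open(path) as f:
            return f.read()
    except FileNotFoundError:
        # The JES2 approach would be: raise ConfigError("Something Wrong")
        # Better: name the file and hint at something the user could try.
        raise ConfigError(
            f"Could not find the config file '{path}'. "
            "Check the path is correct, or create the file with defaults."
        )

try:
    load_config("/tmp/surely-not-here-1234.cfg")
except ConfigError as err:
    print(err)
```

The user still might not be able to fix it, but at least they know where to start looking.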
The universe has short-changed us. Less than 5% of it is made up of the stuff we know, the stuff that makes up planets, coffee tables, cars, otters and people. 95% of it is something else entirely and nobody's really sure what that is. About 70% of the universe is postulated to be dark energy and 25% of it is dark matter and it's the latter I'm interested in today.
We know dark matter exists because galaxies and clusters of galaxies simply wouldn't behave the way they do if it didn't. Galaxies would just fly apart if there wasn't something else contributing to their gravity and holding them together. The thing is, gravity is the only force dark matter seems to interact with and that makes it difficult to study. Gravity makes it obvious it's there but it offers up little about what its constituent properties might be.
There is a class of particles called leptons and if those particles were on social media, the electron would have the most followers. It's the only lepton most people have heard of but there are actually six of them. There are two more electron-like particles called the muon and tau and each of these has a corresponding neutrino.
Neutrinos wouldn't be on social media at all because they really don't like to interact with anything. Millions of the things pass through you each day and you just don't notice it. So neutrinos are something that are definitely there but they're hard to spot. That sounds a bit like dark matter and it is logical that scientists might think neutrinos are a good candidate for dark matter.
Some experiments have hinted there might be a fourth neutrino and it has been dubbed the sterile neutrino. Other experiments — observations, really — have detected x-rays coming from distant galaxies and nobody could explain the source of these. Scientists, using no imagination whatsoever, just called this the unidentified x-ray line. Other scientists put two and two together, worked out a way the sterile neutrino might produce the unidentified x-ray line and pitched it as a dark matter candidate.
It all sounds plausible so far but there's a problem. If dark matter is made up of sterile neutrinos and sterile neutrinos produce the unidentified x-ray line, then we should see such a line in our own galaxy. Alas, a recent experiment suggests it's not there. However, some scientists have said this recent experiment is a load of old tosh. Staplers were hurled across rooms in frustration, striking equation-riddled whiteboards. It has caused a bit of a furore.
More observations of this unidentified x-ray line are needed and a satellite launching in 2022 should provide them. Until then, all bets are off and the sterile neutrino may or may not be a candidate for dark matter.
44 vaccines are currently being developed and tested to see if they will work against coronavirus and Wired gives us a nice summary of them.
Perhaps the most sobering part of it is:
The Covid-19 pandemic is accelerating the slow, safe vaccine development process, but even the most aggressive predictions don't see us getting protective jabs until next year at the earliest.
The WHO are running global trials of four potential treatments and some people think remdesivir is the most promising candidate. Remdesivir introduces errors into the virus's replication process and it was originally developed as a treatment for Ebola.
Rather depressingly, we probably shouldn't pin our hopes on any sort of speedy solution.
Friday 27th March 2020 Briefly ...
Chickens and eggs
In 2011, there were about 6 billion egg-laying chickens in the world and they laid about 1.2 trillion eggs, which is about 3.3 billion eggs a day. That's roughly one egg every other day for everyone in the world.
Let's assume that has all scaled up proportionately to the present day.
Some people don't eat eggs and I wouldn't be surprised if the ratio works out at an egg a day for everyone who's interested in consuming the things (and two a day for me of course).
So why have we got an egg shortage at the moment?
Apparently a chicken will only lay an egg on two out of every three days on average, which is just lazy. How hard can it be? I'm sure with the right encouragement — a stick, perhaps — chickens can lay an egg a day.
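Running the post's numbers (the world population figure is my rough assumption for 2011):

```python
chickens = 6_000_000_000            # egg-laying chickens, 2011
eggs_per_year = 1_200_000_000_000   # eggs laid in 2011
world_population = 7_000_000_000    # rough 2011 figure -- my assumption

eggs_per_day = eggs_per_year / 365
print(f"{eggs_per_day / 1e9:.1f} billion eggs a day")        # 3.3
print(f"{eggs_per_day / world_population:.2f} per person")   # 0.47 -- one every other day
print(f"{eggs_per_day / chickens:.2f} per chicken per day")  # 0.55 -- lazier than 2 in 3
```

Which suggests the average chicken is actually even lazier than two eggs in three days. The stick may be required after all.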
So there's really no excuse for an egg shortage is there?