I may be wrong here, but here's my two cents on the ones I know about.
On your orthogonality thesis:
I think you misunderstand what it's trying to convey. It's saying that an intelligent system can pursue any goal: there is no law of intelligence or of nature that prohibits it from murdering people if it is set to do so. Assuming it will choose not to murder people is anthropomorphic projection; I don't think any law of nature will prevent it.
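To make that concrete, here's a toy sketch (all names hypothetical, not any real system): a generic optimiser whose goal is just a parameter handed in from outside. Nothing in the search procedure cares what the goal rewards, which is the whole point of the thesis.

```python
# Toy sketch of orthogonality (hypothetical names, not a real system):
# the optimiser below is fully generic; the goal is an external parameter.
from typing import Callable, Iterable, TypeVar

Action = TypeVar("Action")

def best_action(actions: Iterable[Action],
                utility: Callable[[Action], float]) -> Action:
    # Nothing here inspects what `utility` rewards: a benign objective
    # and a harmful one are optimised with exactly the same competence.
    return max(actions, key=utility)

# Same optimiser, opposite goals -- its "intelligence" is unchanged:
print(best_action(range(10), utility=lambda a: -abs(a - 3)))  # picks 3
print(best_action(range(10), utility=lambda a: abs(a - 3)))   # picks 9
```

The capability (the search) and the goal (the utility) are separate axes; making the search smarter does nothing to constrain which goals it can be pointed at.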
On your instrumental convergence:
You don't need a sense of self; you just need an intelligent, goal-oriented system. Maybe you're saying there can be intelligence, by whatever definition, without goals? I don't see how that would work.
E.g., take evolution: a goal-oriented system with fixed optimisation power (which I'm taking as the definition of intelligence). Its goal is producing beings that maximise inclusive genetic fitness via natural selection, and it plants instrumental goals, such as self-preservation, in humans.
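Here's a toy sketch of why subgoals like self-preservation fall out of plain goal-directed planning, no sense of self required (all names and numbers are hypothetical, chosen only to illustrate the shape of the argument):

```python
# Toy sketch of instrumental convergence (all names/numbers hypothetical):
# the terminal goal only pays off if the agent is still running to achieve
# it, so "stay operational" scores higher for almost any goal.
def expected_value(goal_value: float, survival_prob: float) -> float:
    # Expected payoff: the terminal goal is worth nothing if the agent
    # gets shut down before completing it.
    return goal_value * survival_prob

def choose_plan(goal_value: float) -> str:
    plans = {
        "pursue goal directly":      expected_value(goal_value, 0.6),
        "secure own survival first": expected_value(goal_value, 0.9),
    }
    return max(plans, key=plans.get)

# Self-preservation wins for any positive goal value, even though it was
# never programmed in as an end in itself:
for v in (1.0, 100.0):
    print(v, "->", choose_plan(v))
```

The agent here has no self-model at all; self-preservation emerges purely because shutdown zeroes out the expected value of whatever goal it was given.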
There isn't really a knowledge explosion. Science is still painstakingly slow and expensive. From your perspective it might seem like there's an explosion, but that's only because AI makes existing knowledge more accessible to you.
The irony of measuring authority by credentials on the topic of general intelligence is hilarious.