> rainbows, their ends, and pots of gold at them are not
It's an analogy. Someone sees a rainbow and assumes there might be a pot of gold at the end of it, so they think if there were more rainbows, there would be a greater likelihood of a pot of gold (or more pots of gold).
Someone sees computing and assumes consciousness is at the end of it, so they think if there were more computing, there would be a greater likelihood of consciousness.
But just like the pot of gold, that might be a false assumption. After all, even under physicalism, there is a variety of ideas, some of which would say more computing will not yield consciousness.
Personally, I think even if computing as we know it can't yield consciousness, that would just result in changing "computing as we know it" and end up with attempts to make computers with wetware, literal neurons (which I think is already being attempted).
This can mean one of 50 different physicalist frameworks. And only 55% of philosophers of mind accept or lean towards physicalism:
https://survey2020.philpeople.org/survey/results/4874?aos=16