Computers are already powerful enough to achieve it
I hadn't properly noticed that point until another poster mentioned it.
But I don't think we know what minimum level of computing power would be needed for proper, useful general intelligence, if it is ever invented (which might never happen). Only once it exists will we know, or eventually find out, how much computing power is needed to create it at various useful (human-equivalent) IQ levels.
I.e. I don't think we know whether existing available computing power, e.g. a big server, or a set of racks full of lots of servers, is currently enough to do it. It might be, but then again it might not be.
Taking the human brain as a reference point is not necessarily going to give accurate results. E.g. computers have been powerful enough to beat even the best human chess champion in a real-time game of chess for a long time now.
EDIT2: On the other hand, as I mentioned, the initial problem of having computers powerful enough to beat even the best human (at the time) at chess has already been solved. I suspect we don't really know, worldwide, how much extra computing power (if any) is needed to be able to create a more general type of AI.
Looking at the computing power of existing human brains seems to me like looking at human muscles to predict how powerful or fast a car is going to be, before cars have been invented.

It would give extremely rough, potentially very inaccurate results, because a machine (a car) is significantly different from a human being.
An analogy, to make the point further:
Maybe we humans can remember a few bytes of information when holding a quick, temporary number in mind. E.g. please remember (without writing it down, or cheating) the 10-digit number 9746641372, while continually jogging, singing out loud, and answering simple mental-arithmetic questions, to be tested on that number in around 3 hours.
So call it 4 or 5 bytes of quick, one-off temporary storage in the human brain when remembering long numbers, without training or being a human memory champion/expert.
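As a quick sanity check on that estimate, here is a minimal back-of-the-envelope calculation (the 10-digit figure comes from the example above; everything else is just arithmetic):

```python
import math

# Smallest number of bits that can represent any 10-digit number,
# i.e. any value from 0 up to 9,999,999,999.
bits = math.ceil(math.log2(10**10))

# Round up to whole bytes.
nbytes = math.ceil(bits / 8)

print(bits)    # 34 bits
print(nbytes)  # 5 bytes
```

So a 10-digit number needs about 34 bits, which rounds up to 5 bytes, matching the "4 or 5 bytes" figure.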
But that doesn't tell us how much RAM a computer would need, in theory, before computers have been invented. Modern computers easily have billions of times that amount.
In other words, some approximations are unwise to make, and using them can lead to extremely misleading results.
EDIT: It also depends on whether the solution, while being used, has to respond in real time like a real human would, or whether the generally intelligent AI machine is allowed to spend many minutes, hours, days, weeks, or even months before producing the same or a better answer to something a human would have answered in a considerably shorter period of time.
E.g. if an AI machine could write a 100% working, tested, to-specification program that is as good as (or better than) something a decent programmer could have written, then even if the human-programmed version would have taken perhaps an hour to write, and the computer takes a few weeks to create it, that would still be both an amazing achievement and potentially useful for people.