While it's worth reading, note that you could replace the word "WiiU" with any console of the past. Working on new consoles sucks. Launches are always messy. Some remember that early Xbox development kits were Apple Macintoshes. Imagine that.
The article's point about CPU clock speeds is a bit extreme and, frankly, wrong. The recent trend is to move more and more work onto the GPU of your computer.
Wolfgang Engel, the "Shader God" of this industry (among other things responsible for Lara Croft's hair in Tomb Raider), explained it to me like this:
"GPUs have been moving in the direction of CPUs since 2007. They are becoming more general purpose. Fortunately they have a multi-core / multi-threading model that is very efficient and much easier to use.
On a fundamental level, the difference between a CPU and a GPU is the type of data they work on. GPUs work well on lots of similar "small" data sets, while CPUs can work on more generic data. It turns out that games have a lot of those small data sets, and such workloads have become more common in general over the last 20 years, as computers generate ever larger amounts of data.
So what we are seeing is that the importance of CPUs in game consoles decreases while the importance of GPUs increases. GPUs take over more and more CPU tasks.
This is why the Xbox One and PS4 have -compared to the PC market- rather slow CPUs but quite fast GPUs."
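The "lots of similar small data sets" point above can be sketched in plain code. The following is an illustrative example, not from the article: transforming thousands of vertices by the same matrix is one identical tiny program applied to many small inputs, which is exactly the shape of work a GPU's parallel execution model is built for. Here it is emulated on the CPU in Python so the pattern is visible without a real shader.

```python
def transform_vertices(vertices, matrix):
    """Apply one 3x3 matrix to every 3-component vertex.

    On a GPU this would be one shader invocation per vertex, all
    running the same small program on different data -- the
    "lots of similar small data sets" pattern described above.
    """
    return [
        tuple(sum(matrix[row][col] * v[col] for col in range(3))
              for row in range(3))
        for v in vertices
    ]

# 10,000 "small data sets": one vertex each, all getting the same treatment.
vertices = [(1.0, 2.0, 3.0)] * 10_000
scale = [[2.0, 0.0, 0.0],
         [0.0, 2.0, 0.0],
         [0.0, 0.0, 2.0]]  # uniform scale, a typical per-vertex transform
scaled = transform_vertices(vertices, scale)
```

On a CPU this loop runs the vertices one after another; a GPU dispatches the same per-vertex program across thousands of cores at once, which is why this kind of workload keeps migrating away from the CPU.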
So there is no reason to complain about your CPU's low clock speed; learn a proper shader language and do your work there. And that is where the major difference from the last generation of consoles lies: when the PS3 and Xbox 360 were designed, GPUs weren't built with that general-purpose architecture. Now they are.
Sidenote: as all three consoles use AMD chips as GPUs, they basically define the GPU and shader standard to come. I wouldn't buy nVidia shares right now...