In the early days, the only thing that mattered was pixel count. The more pixels you could cram onto a sensor, the better. Some people noticed that a higher pixel count hurt low-light performance (for the same sensor size), and we got to a point where you’d have ridiculously high resolutions on extremely small sensors, but you’d need to be a few miles from the sun to get good, sharp footage.
There are all sorts of software tricks you can employ to make the image look better to the casual observer, but you can’t conjure up data that just aren’t there. One trick is to use a very long exposure, but that blurs anything that moves; another is noise reduction, but that smooths away real detail along with the noise.
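To make that trade-off concrete, here’s a minimal sketch (in Python with NumPy, not from any camera vendor’s actual pipeline) of temporal frame averaging, one of the simpler noise-reduction tricks: noise that is random from frame to frame averages out, but anything that moves between frames gets smeared, just like with a long exposure.

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of frames to suppress random sensor noise.

    Noise that is independent between frames averages out (roughly
    by 1/sqrt(N)), but anything that moves across the burst gets
    smeared -- the same trade-off as a long exposure.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Toy demo: a noisy but static scene cleans up nicely after stacking.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 128.0)                       # "true" brightness
frames = [scene + rng.normal(0, 20, scene.shape)     # simulated per-frame noise
          for _ in range(16)]
print(np.abs(frames[0] - scene).mean())              # error in a single frame
print(np.abs(stack_frames(frames) - scene).mean())   # much lower after averaging
```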
So what happened to make cameras improve so dramatically over the last couple of years? The sensor guys came up with back-illuminated sensors, which put the wiring behind the photodiodes instead of in front of them, so more of the incoming light actually gets captured. Basically, the sensors got a hell of a lot better, and now we’re reaping the benefits.
Xda-Developers has a great article on the Sony IMX378 explaining BI sensors and how HDR is achieved. And of course, there’s always Wikipedia.
Competition is a wonderful thing.