I'm going to go out on a limb here, with all the talk about compute power leading to more accurate modeling / prediction of the weather.
I didn't sleep at a Holiday Inn or anything, and I don't play a meteorologist on TV. But as an engineer, I can sympathize with not having enough compute power to model something accurately within a fixed time window.
Let's say we had a 100% accurate model that takes 48 hours to run on supercomputer A.
Obviously we couldn't get tomorrow's forecast out today, since the 100% accurate output wouldn't be ready for 48 hours.
If we dropped the accuracy to 80% so the run only took 12 hours on the same supercomputer A, we'd have a fighting chance of putting something out for tomorrow, but with 20% uncertainty.
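Just to make that tradeoff concrete, here's a toy sketch in Python. The runtimes and accuracy figures are the made-up numbers from my example above, not anything from a real model:

```python
# Toy illustration of the accuracy-vs-runtime tradeoff.
# All numbers are hypothetical, matching the 48 h / 12 h example above.

FORECAST_DEADLINE_HOURS = 24  # the forecast has to be out before "tomorrow"

# (accuracy, runtime on "supercomputer A" in hours) -- made-up pairs
configs = [
    (1.00, 48),  # "perfect" model: too slow to be useful
    (0.80, 12),  # degraded model: fits inside the window
]

for accuracy, runtime in configs:
    usable = runtime <= FORECAST_DEADLINE_HOURS
    print(f"accuracy {accuracy:.0%}: {runtime} h runtime -> "
          f"{'usable' if usable else 'useless'} for tomorrow's forecast")
```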
I think that, in a nutshell, is what's wrong with the upgraded GFS model: they have to run it at a less accurate setting (fewer grid points) to get it to finish in a meaningful time period.
I don't know whether the Euro model (ECMWF) is more computationally efficient or just runs on a larger, more powerful supercomputer, but either way they seem to be able to run their model with more grid points, which increases its accuracy.
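For anyone curious why extra grid points are so expensive: as a rough rule of thumb (a back-of-the-envelope sketch, not a statement about how GFS or ECMWF are actually built), halving the grid spacing doubles the point count in each horizontal direction, and the numerical stability limit (the CFL condition) also forces roughly twice as many timesteps, so the cost goes up by about 2 x 2 x 2 = 8x:

```python
# Rough cost scaling for refining a grid-based weather model.
# Assumption (standard back-of-envelope, not GFS/ECMWF specifics):
# halving grid spacing doubles points per horizontal dimension and,
# via the CFL stability condition, roughly doubles the timestep count.

def relative_cost(refinement: float) -> float:
    """Cost multiplier when grid spacing shrinks by `refinement`x."""
    horizontal_points = refinement ** 2  # 2 horizontal dimensions
    timesteps = refinement               # CFL: dt shrinks along with dx
    return horizontal_points * timesteps

for r in (2, 4):
    print(f"{r}x finer grid -> roughly {relative_cost(r):.0f}x the compute")
# 2x finer -> ~8x the compute; 4x finer -> ~64x
```

So even a modest bump in resolution can eat an enormous amount of supercomputer time, which is exactly the squeeze I was describing.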