

The Top 20 Applications for an Infinite Fast Computer (article first created on 5th July 2009)



The Travelling Salesman is a classic problem. The task is to join the dots, and find the shortest route (the red line path above is one such route, but it may not be the best). It all sounds easy, but it's been baffling mathematicians for decades.

12: Vehicle routing problem


The "Travelling Salesman" is a classic problem devised in 1930 and has since been the subject of much study. Despite its intrinsic O(n!) complexity, it's got to a point where some complicated heuristics can solve for millions of cities within an accuracy of 1-3% (perfect accuracy may still require O(2^n), at least for worst-case scenarios).

But real life always tends to throw a spanner in the works. There will often be added complications such as travel costs, vehicle capacities, time-window restrictions, and different start/end locations for vehicles. All of these can be classed under the more general category of the Vehicle Routing Problem.

Good progress has been made in this regard, using heuristics such as Tabu search. But until exponentially faster CPUs pop into existence, we will continue to spend time tweaking parameters for special-case algorithms, and still obtain results that won't quite be optimal.
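Tabu search itself takes a fair bit of machinery, but its spirit - repeatedly applying local moves that improve the tour - is easy to show with the simpler 2-opt heuristic (a sketch, not a production router; it reverses tour segments whenever doing so shortens the route):

    import math

    def tour_length(points, order):
        return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    def two_opt(points, order):
        # Keep reversing segments while any reversal shortens the tour.
        order, improved = list(order), True
        while improved:
            improved = False
            for i in range(1, len(order) - 1):
                for j in range(i + 1, len(order)):
                    candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
                    if tour_length(points, candidate) < tour_length(points, order):
                        order, improved = candidate, True
        return order

    cities = [(0, 0), (2, 1), (1, 4), (5, 2), (4, 5), (3, 0)]
    print(two_opt(cities, range(len(cities))))

Tabu search adds a memory of recently tried moves (the 'tabu list') so the search can escape local optima instead of cycling straight back into them.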





Created by the amazing artist - Naohisa Inoue. Apparently, they sell infinitely fast computers the size of bubblegum for the price of peanuts in the market store shown here. That would certainly help with folding proteins quickly and efficiently.

11: Protein design/structure and protein folding modelling/simulation


The simulation of a single protein fold currently takes years of computer time to do what nature does in microseconds (nature is apparently around 30 to 100 trillion times faster). Folding@Home currently utilizes 100,000 processors working in parallel, and that's a massive improvement, but we're still around 1 billion times slower than realtime. At that rate, it will take them very approximately 100 years to complete their goal.
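The numbers above hang together as simple division - a quick back-of-envelope check in Python (all figures approximate, taken straight from the paragraph itself):

    # Nature folds proteins ~100 trillion times faster than one simulated CPU.
    slowdown_per_cpu = 100e12
    processors = 100_000              # Folding@Home's parallel fleet (2009 figure)
    effective_slowdown = slowdown_per_cpu / processors
    print(f"{effective_slowdown:.0e}x slower than realtime")  # 1e+09, i.e. ~1 billion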

The fast simulation of protein folding would help us to understand and find cures for Alzheimer's, AIDS, and cancer much more easily - at least in theory. It would seem that infinite CPU speed would not give us an instant 'magic cure' for anything; rather, the results would help steer us in the right direction.



10: Unification of custom chips



Courtesy of Intel: their upcoming Larrabee CPU/GPU (and not, for example, the schematic of the hoverboard from BTTF).
With the CPU running at chronic speeds, we can forget about graphics cards, co-processors, custom chips, and anything of that sort. This not only unifies the architecture, but no doubt saves countless millions of man-hours needed to research CPU design and co-processors in the first place. The CPU itself would be reduced to its simplest design, and any sound or graphics output can be sent directly from the memory/CPU to the audio/visual device.

To some extent, the CPU and graphics card are already converging, as they both aim to dig into each other's traditional territory. Graphics cards can already be used for general purpose computing, and as CPUs increase their number of cores, no doubt they will start to encroach on the graphics card's functions (e.g. via raytracing). Intel's new Larrabee processor seems to set a precedent by combining many advantages of both the CPU and GPU, though only time will tell whether it will serve either purpose very well.

Funnily enough, we theorized about this idea 16 years ago in an old fanzine article on the 'future of computers'. Read it for a laugh, and for some strangely accurate predictions which may yet come to pass.





[Source: US Air Force]. Weather will be as unpredictable as the stunning Aurora Borealis after around two weeks.

9: Weather forecasting


Currently, we can predict the weather well up to around 5-8 days ahead. The theoretical limit is around two weeks. After that point, chaos theory takes over, and it's anyone's guess whether it will rain or shine.
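That sensitivity is easy to demonstrate with the Lorenz system, the classic toy model of atmospheric convection (a sketch using crude Euler integration and the textbook parameters sigma=10, rho=28, beta=8/3 - not a weather model, just chaos in miniature):

    # Two runs from near-identical starting points: a one-in-ten-million
    # difference grows until the trajectories bear no resemblance to each other.
    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return (x + dt * sigma * (y - x),
                y + dt * (x * (rho - z) - y),
                z + dt * (x * y - beta * z))

    a = (1.0, 1.0, 20.0)
    b = (1.0000001, 1.0, 20.0)        # differs by just 0.0000001
    for step in range(4001):
        if step % 1000 == 0:
            print(f"t={step * 0.01:5.1f}   x1={a[0]:9.3f}   x2={b[0]:9.3f}")
        a, b = lorenz_step(a), lorenz_step(b)

By t=40 the two 'forecasts' disagree wildly - the miniature analogue of the two-week limit.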

Of course, predicting the weather more accurately will help farmers know when to plant/harvest crops, construction companies when to build, and shipping/transportation companies what routes to take, and it can help us to forecast very dangerous weather (saving property and lives). It would seem that computational power is the most limiting factor in our ability to do so (except perhaps for dangerous weather, where observational data is limited over the oceans). I quote from here:
    In short, computing power [rather than observational data] is the limiting factor when it comes to extending the range and accuracy of weather forecasts. Therefore, the future of computers will largely determine the future of forecasting.
Infinite speed may be overkill for this problem, but it seems that at least a dozen orders of magnitude more CPU power will be required to reach the limit of weather forecasting.


8: Graphics (creating, rendering, modelling)



Source: Grzegorz Tanski. In the future, all games and 3D software will use full global illumination. I wouldn't hold your breath however (unless the scene objects are static), because CPU time goes through the roof...
One obvious application for a mega-fast CPU would be 3D modelling software. The rendering equation could be solved perfectly - fully ray-traced images with global illumination could be displayed at all times, as if every frame were the final render.
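For the curious, the rendering equation in question (Kajiya, 1986) is, in LaTeX notation:

    L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

Here L_o is the light leaving point x in direction omega_o, L_e is light emitted at x, f_r is the surface's reflectance (the BRDF), and the integral gathers incoming light L_i over the hemisphere Omega around the surface normal n. It is recursive - the incoming light is itself some other surface's outgoing light - which is exactly why solving it properly eats CPU time for breakfast.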

Even 2D programs such as Photoshop would enjoy the speed-up, as working on multiple layers with complex gradients and textures would be a breeze.

Here are two more grand ideas:

Unification of vector and bitmap editing

    For 2D graphics, there would be a nice convergence of vector editing (structured drawing) and bitmap editing (raster/pixel-style painting). Vector editing has always had the advantage of keeping track of the points and mathematical definitions for every shape on the screen. But in a number of regards, it has always fallen short of replacing bitmap editing.

    A perfect example is the smudge or blur tool. Two significant problems arise - namely the representation of the blurred area, and the CPU speed. The former problem can be at least partly solved by representing the blurred area as a new pseudo object (as if the smudge/blur has been freshly drawn in by the user each time the picture is edited or refreshed).

    More troublesome is the speed issue though; pictures composed of vectors can only contain so much detail before the PC is choked to death by millions of points. Drawing then becomes cumbersome, and the picture tedious to edit.

    With infinite speed however, both of the above problems can be overcome, and at last we can use a unified graphics editor that acts like a bitmap editor with effectively infinite resolution, and the ability to re-edit previously drawn shapes like vector editors can.
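    As a toy illustration of the 'pseudo object' idea, here is a sketch in Python (the names Circle, Blur and render are hypothetical, not from any real editor) where every edit - including the blur - is stored as a re-playable operation, so the picture can be re-rendered at any resolution and any earlier shape re-edited:

        from dataclasses import dataclass

        @dataclass
        class Circle:
            cx: float; cy: float; r: float; value: float
            def apply(self, img, res):
                # Rasterize the circle at whatever resolution we're rendering at.
                for y in range(res):
                    for x in range(res):
                        if (x / res - self.cx) ** 2 + (y / res - self.cy) ** 2 < self.r ** 2:
                            img[y][x] = self.value

        @dataclass
        class Blur:
            # Stored as an operation, never baked into the pixels.
            def apply(self, img, res):
                out = [row[:] for row in img]
                for y in range(1, res - 1):
                    for x in range(1, res - 1):
                        out[y][x] = sum(img[y + dy][x + dx]
                                        for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
                img[:] = out

        def render(ops, res):
            img = [[0.0] * res for _ in range(res)]
            for op in ops:
                op.apply(img, res)      # replay every edit at this resolution
            return img

        ops = [Circle(0.5, 0.5, 0.25, 1.0), Blur()]
        preview, final = render(ops, 32), render(ops, 256)   # same edits, any size
        ops[0].r = 0.4                  # re-edit the shape even though a blur exists
        final2 = render(ops, 256)

    With infinite CPU speed, the whole edit history could be replayed like this on every screen refresh.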

True 3D Voxel Editing and beyond


Courtesy of Sevens Heaven. See more of his creations here. Appearances can be deceiving - this is what you really see if you get inside the Galaga arcade monitor and view from the side.
    What Photoshop did for 2D graphics, a hypothetical program would do for 3D. Instead of painting on a flat 2D canvas, structures would be 'sculpted' using billions of 'voxels' - tiny cubes, the 3D equivalent of the square pixel. There's nothing really like it out there, and no wonder why either, as such a program would kill memory and CPU in one shot.

    Oh that's not to say some haven't tried. The closest realization of the idea would probably be something called 'ZBrush'. It's a very curious program which can produce stunning results. However, it doesn't use true voxels, but instead uses pixels with a particular depth value. That means surfaces are usually one sided so you can't view them from behind (or place one 'voxel' behind another for that matter - it's still a 2D array after all).

    Of course, a step above even true voxel painting would be to incorporate the versatility of re-editable 3D vectors with the flexibility of painting voxels (a 3D equivalent of the unification of vector and bitmap editing as mentioned in the previous section). Along with a decent 3D mouse interface, creativity would be completely unbounded. I don't expect anything like it in my lifetime, that's for sure.
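    A toy taste of why memory dies first (a sketch; 'sculpting' here just means carving a solid sphere into a dense 3D array):

        # A dense voxel canvas: even at one byte per voxel, a 1024^3 grid
        # already costs a full gigabyte - billions of voxels add up quickly.
        n = 256                           # try 1024 and watch your RAM vanish
        print(f"{n ** 3 / 2 ** 30:.3f} GiB at 1 byte per voxel")
        voxels = bytearray(n ** 3)        # flat n*n*n grid, 1 byte per voxel

        def sculpt_sphere(cx, cy, cz, r, material=1):
            # 'Paint' a solid sphere of material into the grid.
            for z in range(max(0, cz - r), min(n, cz + r + 1)):
                for y in range(max(0, cy - r), min(n, cy + r + 1)):
                    for x in range(max(0, cx - r), min(n, cx + r + 1)):
                        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r:
                            voxels[(z * n + y) * n + x] = material

        sculpt_sphere(128, 128, 128, 60)
        print(voxels.count(1), "voxels filled")

    And that is before storing colour, before any undo history, and before rendering a single frame.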




Converting from MIDI to MP3/WAV is essentially trivial - you just record what comes out. But doing the reverse - converting from WAV to MIDI - taxes the strongest AI algorithms, and is beyond our level of science for now.

7: Music/sound analysis


In general, music information retrieval is very useful to automatically classify, index, search, and analyse music. Possibilities include translating MP3 to score/MIDI (which is still very tricky), and individual instrument extraction to use in a new composition. Signal analysis is also useful for speech and voice recognition of course. But what else could a zippier CPU do for us here?

Well, there are services out there that attempt to look for music similar to your favourites based on various attributes. But that requires a lot of CPU time, as exhaustive pair-wise comparisons of large music databases are required. Yes, techniques such as locality sensitive hashing can be used to reduce high-dimensional data to a more compact form, but these can be difficult to implement or maintain, and are generally a kludge which won't necessarily be as accurate as sheer brute force.
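To make the hashing idea concrete, here is a minimal random-hyperplane LSH sketch in Python (the per-track feature vectors are invented for illustration; a real service would hash audio fingerprints):

    import random

    DIM, BITS = 16, 8
    random.seed(42)
    # Each random hyperplane contributes one bit of the hash.
    planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

    def lsh_key(vec):
        # Which side of each hyperplane does the vector fall on?
        return tuple(int(sum(p * v for p, v in zip(plane, vec)) > 0)
                     for plane in planes)

    # Hypothetical feature vectors (tempo, spectral statistics, and so on).
    tracks = {name: [random.gauss(0, 1) for _ in range(DIM)]
              for name in ("song_a", "song_b", "song_c", "song_d")}
    buckets = {}
    for name, vec in tracks.items():
        buckets.setdefault(lsh_key(vec), []).append(name)
    print(buckets)  # only tracks sharing a bucket need the costly comparison

Similar vectors usually land on the same side of most hyperplanes, so near-duplicates collide into the same bucket - collapsing the pair-wise search space at the cost of occasional misses (hence "won't necessarily be as accurate as sheer brute force").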

One of the fundamental techniques used before analysing a sound is to first split the signal into a frequency spectrum. This is usually done with STFT/FFT techniques, but with an infinitely fast CPU, one could instead try all possible sets of frequencies, amplitudes, and phase offsets of individual sine waves, mix them, and see which combination produces a result closest to the given signal window. Some signals/sounds may require one or two sine waves, whilst others may require hundreds or even thousands of mixed sine waves (each with their own amplitude, phase and frequency) to come close. It would be computationally prohibitive, but apart from being conceptually simpler, there's also the chance that brute forcing like this may at least partially overcome the 'uncertainty principle', where there's a compromise between frequency and time resolution.
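A toy version of the brute-force idea, for a single sine wave (a sketch; the signal and search grids are invented, and real audio would need far finer grids and many more components):

    import math

    RATE, N = 1000, 200                   # sample rate (Hz) and window length
    # Hidden 'truth' to recover: a 50 Hz sine, amplitude 0.7, phase 1.0 rad.
    signal = [0.7 * math.sin(2 * math.pi * 50 * t / RATE + 1.0) for t in range(N)]

    def error(freq, amp, phase):
        # How far is this candidate sine from the signal window?
        return sum((signal[t] - amp * math.sin(2 * math.pi * freq * t / RATE + phase)) ** 2
                   for t in range(N))

    # Exhaustively try every (frequency, amplitude, phase) on a coarse grid.
    best = min(((f, a / 10, p / 10)
                for f in range(1, 100)    # 1..99 Hz
                for a in range(1, 11)     # amplitude 0.1..1.0
                for p in range(0, 63)),   # phase 0..6.2 rad
               key=lambda fap: error(*fap))
    print("freq=%d Hz, amp=%.1f, phase=%.1f rad" % best)

Even this tiny grid means over 60,000 candidate sines tested against a 200-sample window; an FFT does the equivalent job in a few thousand operations, which is rather the point.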









Most pictures on this page are copyright of their respective owners; where no attribution is made, they are copyright 2008 onwards Daniel White.