Hi Frederik,

When using UF6 (and the previous versions as well), my computer's CPU (an Intel i5 with 4 cores) is working at 100% while my GPU (an NVIDIA GeForce GTX 670) is doing ... right ... nothing at all.

There is software - I'm referring to "SETI@home" (*) - that does math calculations using not only the CPUs in your PC but also the GPU, as you can see in the diagram below, and the GPU's calculations are much, much faster than the CPU's. One disadvantage: the heat production of the CPU/GPU combination.

[Attachment: 5a439626ce22e.png - CPU/GPU usage diagram]

Is there a possibility that this calculation method will be introduced in UF?

(*) SETI@home is a scientific experiment, based at UC Berkeley, that uses Internet-connected computers in the Search for Extraterrestrial Intelligence (SETI). You can participate by running a free program that downloads and analyzes radio telescope data.


The GPU seems an obvious fit for fractal calculations, but unfortunately it doesn’t work at all. Almost all consumer-level GPUs perform their arithmetic with 32-bit floating-point numbers, which is really only enough for the first couple of zoom levels. You need 64-bit floats, but only very expensive pro cards provide this. I’ve tested simulating 64-bit math with multiple 32-bit operations, but that turned out to be slower than the regular CPU.
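
For anyone wondering what “simulating 64-bit math using multiple 32-bit operations” means: the standard trick is so-called float-float (or double-single) arithmetic, where each number is stored as an unevaluated sum of two floats, giving roughly 48 bits of mantissa instead of a single float’s 24 (about 7 decimal digits, which run out after only a handful of zoom levels). The sketch below is just a minimal CUDA illustration of that general technique, with made-up names; it is not the code I actually tested:

```cuda
#include <cuda_runtime.h>

// A "float-float" value: hi + lo together carry roughly 48 bits of mantissa,
// using only the 32-bit floats the GPU is fast at (Dekker/Knuth trick).
struct ff { float hi, lo; };

// Knuth's TwoSum: returns s and e such that s + e == a + b exactly.
// (Relies on the compiler not reordering float ops, so no fast-math.)
__device__ inline ff two_sum(float a, float b) {
    float s = a + b;
    float v = s - a;
    float e = (a - (s - v)) + (b - v);
    return ff{ s, e };
}

// Add two float-float numbers (with a simplified renormalization).
__device__ inline ff ff_add(ff a, ff b) {
    ff s = two_sum(a.hi, b.hi);
    s.lo += a.lo + b.lo;
    return two_sum(s.hi, s.lo);
}

// Multiply two float-float numbers; fmaf recovers the exact rounding error
// of the high product, and the cross terms are folded in approximately.
__device__ inline ff ff_mul(ff a, ff b) {
    float p = a.hi * b.hi;
    float e = fmaf(a.hi, b.hi, -p);
    e += a.hi * b.lo + a.lo * b.hi;
    return two_sum(p, e);
}

// Toy kernel: one z = z*z + c step per point, entirely in float-float.
__global__ void mandel_step(ff *zx, ff *zy, ff cx, ff cy, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    ff x = zx[i], y = zy[i];
    ff xx = ff_mul(x, x);
    ff yy = ff_mul(y, y);
    ff xy = ff_mul(x, y);
    ff neg_yy = ff{ -yy.hi, -yy.lo };          // negation just flips signs
    zx[i] = ff_add(ff_add(xx, neg_yy), cx);    // re(z^2) + re(c)
    zy[i] = ff_add(ff_add(xy, xy), cy);        // 2*re(z)*im(z) + im(c)
}
```

Every addition and multiplication of the fractal formula expands into on the order of 10 to 20 float instructions, and they are serially dependent, so the GPU’s raw throughput advantage shrinks very quickly.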

Then there’s the problem that the GPU blocks all graphics work while it’s computing fractals, so it essentially blocks the entire computer. It’s very hard to work around this. All in all, it just doesn’t work out. I know there are many examples of Mandelbrot GPU plotters, but if you look closely, none of them provide proper zooming.
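
The usual mitigation people suggest is to split the render into many short kernel launches so the display driver can schedule regular graphics work (and avoid its watchdog timeout) in between. A rough sketch of that idea, again with made-up names and plain 32-bit floats for brevity:

```cuda
#include <cuda_runtime.h>

// Plain 32-bit Mandelbrot iterations for one batch of points. The point here
// is launch granularity, not precision, so the formula is only illustrative.
__global__ void iterate_batch(float2 *z, const float2 *c, int *iters,
                              int n, int steps) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float2 p = z[i];
    int k = iters[i];
    for (int s = 0; s < steps && p.x * p.x + p.y * p.y < 4.0f; ++s, ++k) {
        float nx = p.x * p.x - p.y * p.y + c[i].x;
        p.y = 2.0f * p.x * p.y + c[i].y;
        p.x = nx;
    }
    z[i] = p;
    iters[i] = k;
}

// Host loop: instead of one huge launch that monopolizes the GPU, submit many
// short launches so normal graphics work can run in between.
// z, c and iters are device pointers (allocated with cudaMalloc).
void render_in_chunks(float2 *z, const float2 *c, int *iters,
                      int num_points, int total_steps, int steps_per_launch) {
    const int block = 256;
    const int grid = (num_points + block - 1) / block;
    for (int done = 0; done < total_steps; done += steps_per_launch) {
        iterate_batch<<<grid, block>>>(z, c, iters, num_points,
                                       steps_per_launch);
        cudaDeviceSynchronize();  // let the driver interleave other work
    }
}
```

Even chunked like this, the display tends to stutter while the kernels run, which is part of why it is so hard to work around.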

Ultra Fractal author

 