The future of Blender

100 GHz means 3 mm in vacuum: light travels only that far in one clock cycle. That means wait states between nearby parts of the same chip, and tens of wait states when a signal leaves the chip. For small circuits like super-fast ADCs or frequency dividers that is fine; for anything else it is like driving an F1 race car down a road with speed bumps every 20 meters.
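The arithmetic behind that claim is easy to check (assuming propagation at the vacuum speed of light; real on-chip signals are slower, so the situation is actually worse):

```python
# Distance a signal covers during one clock period at a given frequency,
# assuming vacuum-speed-of-light propagation (on-chip signals in
# copper/silicon travel considerably slower than this).
C = 299_792_458  # speed of light, m/s

def distance_per_cycle_mm(freq_hz: float) -> float:
    """Millimetres covered by light during one clock period."""
    return C / freq_hz * 1000

print(distance_per_cycle_mm(100e9))  # ~3 mm at 100 GHz
```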

As for quantum computing, it is a business like fusion: always x years in the future.

The only things that keep growing forever are cancer cells.

It seems like the logical limit of human/machine interaction to me

Who is John Galt?

Take a look at memristors: a component theorized decades ago, but only relatively recently realized in hardware.

In today’s computers, memory is a bottleneck; memristors have the potential to offer a solution: cheaper, much, much faster, and more versatile.

While quantum tunneling will certainly produce a bottleneck for traditional silicon-based chips, Intel believes it can go as low as 5 nm.

In addition, it isn’t all about speed; clock speed is no magic wand, and it creates heat which is dangerous to chips. There are already alternatives to higher speeds: adding more cores to a chip raises the performance potential, but this requires software vendors to actually produce code that takes advantage of those cores. Software like Blender, 3ds Max, and Photoshop is coded to exploit them.

Sadly, adding cores to a chip isn’t a parallel to what manufacturers have done with adding transistors; more cores don’t automatically mean more performance. Running code in parallel only works when the results of the various snippets can actually be used; when they can, execution moves on, and when they can’t, the work has to be redone. It requires clever coding, and as I understand it, that means both Intel at the processor level and the software vendors too.
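The limit described here is usually stated as Amdahl's law (the post doesn't name it, but it is the standard formulation): if a fraction p of a program can run in parallel, n cores give a speedup of 1 / ((1 − p) + p/n), which flattens out quickly.

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# program is parallelizable. The serial fraction (1 - p) dominates
# as n grows, which is why extra cores stop paying off.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for n in (2, 4, 8, 64):
    print(n, round(amdahl_speedup(0.9, n), 2))
# With 90% parallel code, 64 cores give only ~8.8x, not 64x.
```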

My point is that physics isn’t the problem, but will rather offer the solution; it has already allowed us to recognise a future problem, and thus to seek solutions.

Like all things to do with computing, Blender has no future. In 20 years’ time it will be as obsolete as Aldus PageMaker.

And it is a mistake to focus too much on processing speed, which isn’t the issue. The bottleneck - which is fast disappearing - is connectivity. In the future, all processing-intensive applications will be run from the cloud, on tablets, with touch interfaces. The mouse will be history. There will be no distinction between rendering a picture and using a render farm, the cloud will offer instant scalability. There will be much less of the current duplication of effort in modelling as models will be a commodity, cheaply available through the application and “parameterised” for easy adjustment. There will be more interaction among users, buying and selling in the application, and the touch interface will be more immediate and pleasant. Physics simulation will be an integral part of the interface, making object arrangement easier.

Whoever creates the application that will be dominant in this new domain, will make a lot of money… And it could be open source, as the income will be from trading and other services, not the use of the app itself.

Nice reference :stuck_out_tongue:

freezing, freezing, freezing… :(((

after 10 years of Blender use in my work (nearly 8 hours per day) I now have a big fear…

oh my God… the UVing is full of bugs :frowning:

Report them to the bug tracker then.

Why? Developers should be able to read Endi’s mind by now…

Not likely, they are still busy developing Blender :stuck_out_tongue:

Well, I think in a few years Blender will need to be a more guided program, instead of one where you memorize all the keystrokes.
It’s almost like the old WordPerfect 5.1 ( <Shift> F7 = printing menu ).

So when you draw a poly line, on-screen buttons will appear around your cursor to smooth it, or to use it as a camera or object path, etc., along with help buttons that explain those possible next steps.
The GUI system has to be slightly smart: it must understand the mode you are in so it can predict the next steps you can take. Those appear as a circle around your mouse pointer; the mouse wheel selects one, left click executes it, right click gets help on the subject. No more keyboard shortcuts, no more side menus… the screen is more yours. Some shortcuts will still exist, but by default they are now voice controlled; the keyboard option remains only for people who are disabled, but most people will use the mouse with some voice on top.

  • for example, after adding lights and moving objects, the help content would focus on positioning, like aligning with other objects
  • when you edit a mesh, click once on a surface to be asked whether to add more surface or to select a line or point
    after the next surface selection, a popup would appear offering to select an entire row in this orientation, to select everything surrounding this surface within a flat area, or to grow the selection.
  • this interface would be like a circle around your crosshair; the middle mouse wheel would rotate through selections, and clicking the mouse wheel would confirm one.
  • on Android there is no mouse wheel, but one will be emulated at the bottom of the screen.

Essentially this wheel will contain all commands, each explained, even with referrals to YouTube examples.
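A minimal sketch of such a context-sensitive wheel, assuming a hypothetical mapping from editor mode to suggested next actions (all mode and action names below are invented for illustration, not real Blender identifiers):

```python
# Sketch of a context-sensitive command wheel: the current editor mode
# determines which actions appear around the cursor. All mode and
# action names are hypothetical, not Blender API.
SUGGESTIONS = {
    "object":    ["move", "rotate", "scale", "add light", "align"],
    "edit_mesh": ["extrude", "select loop", "select linked", "grow selection"],
    "curve":     ["smooth", "use as camera path", "use as object path"],
}

def wheel_actions(mode: str) -> list[str]:
    """Actions the wheel would offer in the given mode."""
    return SUGGESTIONS.get(mode, ["help"])

print(wheel_actions("curve"))
```

The point of the sketch is only that the menu is driven by context, so the user scrolls a short, relevant list instead of memorizing shortcuts.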

  • the whole manual is strongly tied to the Blender version, and lots of small demos made by the public get into it through a voting system included in the manual. The manual includes language settings for brief explanations.
    Of course Blender itself will have an advanced record-editing option, so it will be easy to show what you do and at what speed you like to play a demonstration (the watcher can slow the speed down)… Most likely voice-to-text is used in the manual, and from there text-to-voice in the person’s native language… As this text is standard for each movie and more people watch it in different languages, people can improve the translations. Note that a big part of this will be created by Google; only the integration of language into a new help tool is new.

  • alongside explaining how to use the tools, even more important is that the same help tool will explain Python for Blender.
    After the Python example routines, the same goes for Blender internals; this will improve development speed once more people get to understand the core.

  • a next step will be real-time 3D scanning of objects, now that we see Blender running on mobile phones.

  • a global library of scanned Blender models, with content stored decentrally and exchanged through BitTorrent-like protocols.

  • with advances in 3D scanning from movies, people will make vacation movies using Blender and do all kinds of post-work with it.
    These movies start as 2D footage (though Blender will be able to record with dual cams too) and are then reprocessed by Blender.

  • Blender will detect light conditions from movies.

  • Blender will have a true physics engine.

  • combined effects with real object recognition: tell Blender it’s a chair and say “explode chair,” for example (Blender will understand voice commands as a shortcut to the other menu system).

  • more render engines will join with Blender.

  • Blender will have a peer-to-peer mode to create a next-level Second Life.

  • people will be able to use Blender as a tool to connect with other people this way (they first scan their room, then connect to someone else).

  • Blender’s power will become blending new areas together, and including other graphics-related manipulation, whether 2D, 3D, or 4D.

  • Blender 3D scanning will become the standard not only for 3D printing but also for medical scans. Open source is the right way to develop for everyone, and it is a major goal for a growing group of developers, with especially rapid growth in those countries or companies that pay developers for their free contributions. Industrial companies in particular see the benefit of open-source development: faster, better products. Industrial companies will increasingly be 3D-print companies (~15 years).