Page 3 of 4
Results 41 to 60 of 68
  1. #41
    Member
    Join Date
    Sep 2012
    Posts
    2,424
    meh, now you're just being vain
    as ever developing & at least they are not alike piiiiiiiiiii... "... i'm just talking about dirty Frank!..."
    gossip boy



  2. #42
    Member Gnaws's Avatar
    Join Date
    Jul 2008
    Location
    Las Vegas
    Posts
    349
    Originally Posted by Thenewone View Post
    ...it only taints...

    I hear ya. We just have to acknowledge that Blender will always be considered a "taint". #StayStrong



  3. #43
    You mention the resistance to change in the Lightwave community and how Blender is different.
    I hate to tell you this, but we had the same kind of whiners in the Blender community who bitched their heads off about us not needing a new renderer (Cycles) or new modelling tools. We had the same "Blender is good enough as it is and should never be improved or changed" attitude here. Thankfully, they were ignored.



  4. #44
    Member colkai's Avatar
    Join Date
    Dec 2011
    Location
    Staffs, UK
    Posts
    681
    Personally, I got tired of the "just wait for the next release.. it'll be awesome" line a long time ago, hence my switch to Blender. As a hobbyist, I couldn't justify continuing to throw money at something that never delivered what was claimed, or that switched development route ONCE they had your money. Someone posted "give us your money now for a release sometime.. whenever"; that, alas, was all too true.
    I am sure if folks are making money off it and have infinite patience, they can afford to be relaxed about it. Blender has turned out to be a huge surprise for me in its power and flexibility, having the modelling tools I often wished LW had, and then some.



  5. #45
    Member
    Join Date
    Mar 2002
    Location
    Halifax, Nova Scotia, Canada
    Posts
    46
    Just to answer the main topic question: in my view, Newtek management's decision not to let the original programming team build the next-gen Lightwave (the project that became Modo) was the company's biggest mistake. Whatever truly happened behind closed doors at Newtek in 2002-2003, we will never know.

    A friend of mine was hired by Newtek to work with the original programming group (i.e. Allan Hastings, Stuart Ferguson, etc.). He worked there for a year. Later, he told me that the programming team was leaving Newtek. Much later, Luxology was born and Modo was introduced.

    Just a thought...



  6. #46
    Member Safetyman's Avatar
    Join Date
    Mar 2010
    Location
    Maryland, USA
    Posts
    1,235
    Originally Posted by Ace Dragon View Post
    Bumping this thread, because there was hype a while ago over the supposed release of Lightwave Next in three months (which would place the date around now).

    Well...
    http://forums.newtek.com/showthread....arly-April-etc

    Even among a number of avid fans, the morale of their community is going down and some are starting to jump ship (this indicates that NewTek simply can't afford to hold the release much longer if they don't want to start losing the userbase). This is clearly becoming make or break for the company. If LW Next isn't all as it's advertised to be, the application risks joining Truespace and XSI in the graveyard.
    I don't understand why LW3DG won't release some sneak peeks or engage the community in order to keep them interested. Are they so full of themselves as to think that folks will just wait and see, or do they really not have anything stable to show? It could be something else, but they should at least have a thread keeping everyone updated on progress. Something... anything.

    BTW -- I think it's funny how often Blender gets mentioned on the LW forums... in quite a lot of the topics.
    Quicksand has no sense of humor



  7. #47
    Originally Posted by Safetyman View Post
    BTW -- I think it's funny how often Blender gets mentioned on the LW forums... in quite a lot of the topics.
    It's a good thing they do; I'm sure it's led a lot of folks to Blender, like myself.
    I don't think I'll ever pay for a LW update again!

    The problem isn't the software or the developers, it's Newtek: you can't depend on them as a company.



  8. #48
    Member Farmfield's Avatar
    Join Date
    Dec 2011
    Location
    H'sing Island
    Posts
    1,759
    Originally Posted by Tea_Monster View Post
    You mention the resistance to change in the Lightwave community and how Blender is different.
    I hate to tell you this, but we had the same kind of whiners in the Blender community who bitched their heads off about us not needing a new renderer (Cycles) or new modelling tools. We had the same "Blender is good enough as it is and should never be improved or changed" attitude here. Thankfully, they were ignored.
    ^ this. But with a louder voice.

    I was involved in the thread at Blender's forum that finally made them budge and implement the warning-to-save popup before closing Blender. The resistance to implementing it was insane. People were so fukn mad at those of us who drove the question, calling us idiots and saying it was our own fault if anyone ever lost a scene by X'ing down a Blender window by mistake. o_O

    That kinda attitude takes me over the edge, those people are complete r... oh, sorry, I forgot the word is no longer socially acceptable to use, hehe, but there's few words that describe them better...

    So yeah, that sh!t is very alive in the Blender community - and of course in many others as well. I gotta say I've seen very little of it in the Houdini community, though, and that's a company that can just deprecate whole systems from one version to the next, hehe - but they've had this attitude for so long that the user base is used to it and accepts it fully...



  9. #49
    Member SaintHaven's Avatar
    Join Date
    Dec 2008
    Location
    Los Angeles, CA
    Posts
    2,237
    Originally Posted by Ace Dragon View Post
    This is clearly becoming make or break for the company. If LW Next isn't all as it's advertised to be, the application risks joining Truespace and XSI in the graveyard.
    Eh... Not really. NewTek's biggest source of revenue isn't LW; while they obviously want LW to do well, they don't need it to in order to survive financially.

    Current LW still has a massive userbase, especially in certain parts of the world. It also has a fairly decent 3rd-party plugin ecosystem. Granted, things are diminishing over time, but it's not all that critical just yet.

    NewTek also wouldn't just can the software if they did not want to develop it anymore, but rather shop it around so some other group could pick it up.

    Truespace wasn't popular enough for Microsoft to do that (plus as a corp they just don't need to care about such losses), and well XSI was bought specifically by Autodesk to remove the competition (after keeping a bit of the talent). It's quite disgusting that Autodesk keeps getting away with that kind of practice. Anyways, LW via NewTek is not in the same boat or situation.

    Originally Posted by Tea_Monster
    I hate to tell you this, but we had the same kind of whiners in the Blender community who bitched their heads off about us not needing a new renderer (Cycles) or new modelling tools. We had the same "Blender is good enough as it is and should never be improved or changed" attitude here.
    Originally Posted by Farmfield View Post
    ^ this. But with a louder voice.
    Yep, there was even vitriolic push back against tabs in the tool bar (guess they wanted endless scrolling) and even the optional inclusion of pie menus. It was literally insane.
    Last edited by SaintHaven; 18-May-17 at 14:54.



  10. #50
    Member
    Join Date
    May 2011
    Posts
    242
    Originally Posted by Farmfield View Post
    I was involved in the thread at Blender's forum that finally made them budge and implement the warning-to-save popup before closing Blender. The resistance to implementing it was insane. People were so fukn mad at those of us who drove the question, calling us idiots and saying it was our own fault if anyone ever lost a scene by X'ing down a Blender window by mistake. o_O
    Unfortunately though, having the console window open and clicking its close button has caused me several headaches, and it's also very unintuitive, since the graphical window doesn't do this.
    To clarify, I would expect a click on the console window's close button to close only the console window. I.e. I don't expect it to close Blender, be it with or without a warning dialog. I'm not sure if there is a preference for that, since I haven't looked, but even if there is, the default is clearly wrong.
    Last edited by xol; 19-May-17 at 09:52.



  11. #51
    Originally Posted by xol View Post
    To clarify, I would expect a click on the console window's close button to close only the console window. I.e. I don't expect it to close blender, be it with or without a warning dialog. I'm not sure if there is preference for that since I haven't looked, but even if there is, the default then is clearly wrong.
    For all I know (not being a win32 guru), that's not really possible, because Windows controls that window and it represents the Blender process. The fact that this window even exists is a bit of an oddity for a GUI application on Windows. You usually either have a GUI or a console application. However, console applications can create windows, too. That's what Blender does. The reason must be that you can use platform-independent stdout/stderr (console output) that way.

    While I'm all in favor of a more forgiving computing experience, unless you're hitting CTRL+S every few seconds, you're not using Blender correctly. The program could crash at any instant and then you won't get a "would you like Blender to crash without losing your work?" prompt. It's all the more important because of Blender's incomprehensible auto-save feature. Also, don't forget about your fake users...



  12. #52
    Member Ace Dragon's Avatar
    Join Date
    Feb 2006
    Location
    Wichita Kansas (USA)
    Posts
    27,892
    It's actually possible for a program to at least contain a crash so it doesn't nuke the whole session.

    I've seen this in action with Genetica 4.0, which at least allows me to save the project before exiting (even though your mileage in terms of everything working can vary a little).

    Citing that, it would theoretically be possible for Blender to detect a code failure and allow the chance to save the .blend file as Blender shuts down (and give you a short transcript of the crash as well). The only issue then is the amount of work that might be needed to have something like this.
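The idea above can be sketched in a few lines of Python. This is a hedged illustration, not Blender's actual crash handling: all names here (save_scene, RECOVERY_PATH, run_with_crash_save) are hypothetical, and a real crash handler would hook OS-level signals in C rather than catch Python exceptions.

```python
# Hedged sketch: attempt an emergency save when a fatal error escapes the
# main loop, then print a short crash transcript. Hypothetical names only.
import sys
import traceback

RECOVERY_PATH = "recovery_save.txt"   # hypothetical recovery file location

def save_scene(path, scene):
    """Stand-in for a real serializer: dump the scene data to disk."""
    with open(path, "w") as f:
        f.write(repr(scene))

def run_with_crash_save(main_loop, scene):
    """Run the app; on a fatal error, try an emergency save and report."""
    try:
        main_loop(scene)
    except Exception:
        # The data may itself be corrupted at this point, so the save can
        # fail; we try anyway, then always show the crash transcript.
        try:
            save_scene(RECOVERY_PATH, scene)
            print("emergency save written to", RECOVERY_PATH)
        except Exception:
            print("emergency save failed")
        traceback.print_exc(file=sys.stderr)
        return 1
    return 0
```

The split matters: the save attempt is itself wrapped, so a corrupted scene can't turn the crash report into a second crash.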

    While on the subject, may I suggest a setting in the preferences that would abort the render-preparation process if memory usage exceeds a user-defined amount, as going deep into swap can be all too easy with things like microdisplacement (and before someone boos the idea, let me say that nothing in this reply suggests anything that would genuinely slow down the user's workflow).
    Sweet Dragon dreams, lovely Dragon kisses, gorgeous Dragon hugs. How sweet would life be to romp with Dragons, teasing you with their fire and you being in their games, perhaps they can even turn you into one as well.
    Adventures in Cycles; My official sketchbook



  13. #53
    Member Farmfield's Avatar
    Join Date
    Dec 2011
    Location
    H'sing Island
    Posts
    1,759
    Originally Posted by xol View Post
    Unfortunately though, having the console window open and clicking its close button has caused me several headaches, and it's also very unintuitive, since the graphical window doesn't do this.
    To clarify, I would expect a click on the console window's close button to close only the console window. I.e. I don't expect it to close Blender, be it with or without a warning dialog. I'm not sure if there is a preference for that, since I haven't looked, but even if there is, the default is clearly wrong.
    LOL, that freaks me out in Nuke, and The Foundry has refused to fix it as well. The upside is Nuke has an insanely good autosave feature, so even when it happens, you never lose anything - or if you do, it's absolutely minimal. Just awesome.

    Oh, and on that side note: why don't all apps have an autosave like that? It can't be much of a mystery to figure out how the hell the Nuke devs created it - even Houdini uses the normal save function for autosave, so if you've got a huge scene, it'll lock up the app for 10 seconds or whatever while saving it - maddening...
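A non-blocking autosave of the kind described can be sketched like this (hypothetical names, not Nuke's or Houdini's actual implementation): snapshot the scene in RAM, serialize it on a background thread, and publish the file with an atomic rename so readers never see a half-written save and the main thread never stalls on disk I/O.

```python
# Hedged sketch of a non-blocking autosave: deep-copy the scene up front,
# write the copy on a background thread, swap it into place atomically.
import copy
import os
import threading

def autosave(scene, path):
    """Snapshot the scene, then write and atomically publish the backup."""
    snapshot = copy.deepcopy(scene)       # cheap in-RAM copy, done up front

    def write():
        tmp = path + ".tmp"
        with open(tmp, "w") as f:
            f.write(repr(snapshot))       # stand-in for a real serializer
        os.replace(tmp, path)             # atomic: no partial files visible

    t = threading.Thread(target=write, daemon=True)
    t.start()
    return t                              # caller may join() or ignore
```

The only pause the user sees is the deep copy; the slow part (disk I/O) happens off the main thread.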



  14. #54
    In a program like Blender, which doesn't really do structured exception handling, pretty much all errors are of the unrecoverable kind (i.e. memory has become corrupted). In theory, you could try saving the document but there's a significant chance that the data is corrupted as well.

    As for the "memory warning": It's always easier for programmers to just assume the user has an infinite amount of memory. We don't really know how much memory we're really using, unless we're explicitly keeping track of it - and we generally don't. I'm not saying that this is good practice, but I'd still advise users to install an infinite amount of memory, if possible.



  15. #55
    Member Ace Dragon's Avatar
    Join Date
    Feb 2006
    Location
    Wichita Kansas (USA)
    Posts
    27,892
    Originally Posted by BeerBaron View Post
    As for the "memory warning": It's always easier for programmers to just assume the user has an infinite amount of memory. We don't really know how much memory we're really using, unless we're explicitly keeping track of it - and we generally don't. I'm not saying that this is good practice, but I'd still advise users to install an infinite amount of memory, if possible.
    Blender can detect if you have a compatible GPU for things like Cycles rendering (and even just refuse to start in favor of a warning if it knows your GPU is ancient), how hard could it be for Blender to know how much RAM you have?

    Even then, I was talking about a user-defined max memory setting for the rendering process (most of the time, you only go a little bit into swap at the end of the preparation process, and at the cost of a time penalty you can still have a normal rendering session without slowdown).

    I have 24 gigs of RAM, for instance; I might want to set the pre-rendering stage to abort if swap use goes over 4 gigs, to avoid any chance of a bad lockup (I do have an SSD, but I don't use it as the boot drive).
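The cap being asked for can be sketched with Python's stdlib tracemalloc. This only counts allocations made through Python's allocator, so it illustrates the idea of a tracked memory budget with a clean abort; Blender's C/C++ render preparation would have to instrument its own allocators to do the same. All names here are hypothetical.

```python
# Hedged sketch: run preparation steps under a user-defined memory cap,
# aborting cleanly (instead of swapping/locking up) when the cap is passed.
import tracemalloc

class MemoryBudgetExceeded(RuntimeError):
    """Raised when tracked memory passes the user-defined cap."""

def prepare_with_budget(build_step_fn, steps, max_bytes):
    """Run each preparation step, checking tracked memory after each one."""
    tracemalloc.start()
    try:
        results = []
        for step in steps:
            results.append(build_step_fn(step))
            current, _peak = tracemalloc.get_traced_memory()
            if current > max_bytes:
                raise MemoryBudgetExceeded(
                    f"aborted: {current} bytes tracked, cap is {max_bytes}")
        return results
    finally:
        tracemalloc.stop()
```

Checking between steps (rather than inside an allocator hook) keeps the overhead low while still aborting before the machine goes deep into swap.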



  16. #56
    Member
    Join Date
    May 2011
    Posts
    242
    Originally Posted by BeerBaron View Post
    For all I know (not being a win32 guru), that's not really possible, because Windows controls that window and it represents the Blender process. The fact that this window even exists is a bit of an oddity for a GUI application on Windows. You usually either have a GUI or a console application. However, console applications can create windows, too. That's what Blender does. The reason must be that you can use platform-independent stdout/stderr (console output) that way.
    I only did tiny apps in win32, and most of that was years ago. IIRC I did programmatically create a console window at some point. I don't remember whether the API allowed intercepting console window events, but I'd be very surprised if not. Or rather, I'd be surprised if there is no way to intercept and suppress termination of the application even if caused by the console window. I'm almost tempted to look it up, but that would take some time.

    Anyway, from what I remember, a console app is simply an app with a flag set appropriately in the PE header, which directs the program loader to initialize a console window for the app in question. If one did not set that flag (e.g. by passing "-Wl,--subsystem,windows" in GCC), one would then need to explicitly create a console window through the API (the AllocConsole() function, just googled).

    Haven't read this yet, but I found it interesting: https://msdn.microsoft.com/en-us/lib...(v=vs.85).aspx



  17. #57
    Originally Posted by Ace Dragon
    how hard could it be for Blender to know how much RAM you have?
    First of all, that's not the problem you posed. You're asking for checks on how much memory is being used (in multiple places) and then have the program gracefully abort whatever operation is underway. It's not that simple.

    Secondly, never ask "how hard is it to...?", but ask "can you be bothered to...?". Generally, the answer is "no, I have a dozen other things to do that are more important". Buy more RAM, watch your RAM usage. Again, not saying that it's "good" that way, but worse is better.

    Why aren't you complaining to your OS vendor that the system becomes unusable when paging, anyway? It doesn't have to be like this, it's not a law of computing. It's just that no one ever bothered to implement the facilities to make running out of memory a more pleasant experience. Maybe it's a conspiracy on behalf of the DRAM lobby...

    Originally Posted by xol View Post
    Anyway, from what I remember, a console app is simply an app with a flag set appropriately in the PE header, which directs the program loader to initialize a console window for the app in question. If one did not set that flag (e.g. by passing "-Wl,--subsystem,windows" in GCC), one would then need to explicitly create a console window through the API (the AllocConsole() function, just googled).
    Yes, it is like that. You can't use stdout until you AllocConsole though, so you'll do it at the start anyway. Might as well go with the PE flag. If you get a window handle for the console, you can fuck around with it (i.e. disable the button).

    Haven't read this yet, but I found it interesting: https://msdn.microsoft.com/en-us/lib...(v=vs.85).aspx
    As far as I can tell, this is just a notifier, it doesn't let you prevent process termination (see also the stackoverflow link above).
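The "disable the button" part can be sketched in Python via ctypes. GetConsoleWindow, GetSystemMenu and DeleteMenu are real Win32 APIs; the function degrades to a no-op off Windows so the sketch stays runnable anywhere, and whether this is the right default for Blender is left as the open question it is in the thread.

```python
# Hedged sketch: grey out the console window's close button on Windows by
# removing SC_CLOSE from its system menu. No-op on other platforms.
import sys

SC_CLOSE = 0xF060        # system-menu command id for the close button
MF_BYCOMMAND = 0x0000    # interpret the id above as a command, not a position

def disable_console_close_button():
    """Return True if the button was disabled, False if not applicable."""
    if sys.platform != "win32":
        return False
    import ctypes
    hwnd = ctypes.windll.kernel32.GetConsoleWindow()
    if not hwnd:
        return False     # no console attached to this process
    hmenu = ctypes.windll.user32.GetSystemMenu(hwnd, False)
    if not hmenu:
        return False
    ctypes.windll.user32.DeleteMenu(hmenu, SC_CLOSE, MF_BYCOMMAND)
    return True
```

This only removes the button; Ctrl+C handling and other termination paths would need SetConsoleCtrlHandler, which is a separate (and per the linked discussion, weaker) mechanism.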



  18. #58
    Member
    Join Date
    May 2011
    Posts
    242
    Originally Posted by BeerBaron View Post
    Yes, it is like that. You can't use stdout until you AllocConsole though, so you'll do it at the start anyway. Might as well go with the PE flag. If you get a window handle for the console, you can fuck around with it (i.e. disable the button).
    You/users don't always want a console window cluttering the taskbar and/or view, getting in the way and so on, which of course is the reason for the dynamic-creation API. Since stdout can be redirected to files and such, there shouldn't be a need for an existing console window in order to write to stdout. Imagine Blender with an obligatory console window, always, and forever...
    I like that close-button-disabling ability, at least. Someone with a build env reading this, please add it. Preferences entry or not, just add it, please.

    As far as I can tell, this is just a notifier, it doesn't let you prevent process termination (see also the stackoverflow link above).
    I don't know, I still haven't read it through. Considering the SO page you linked where it was mentioned, I'll have to take your word for it.



  19. #59
    Member Ace Dragon's Avatar
    Join Date
    Feb 2006
    Location
    Wichita Kansas (USA)
    Posts
    27,892
    Another bad sign for Lightwave? It has officially been dropped from the pipeline for the new Star Trek series.
    http://forums.newtek.com/showthread....highlight=trek

    The reason I mention it as a possible bad sign is that its purported use was a key argument for the idea that the app still had a potentially bright future; so much for that.



  20. #60
    Originally Posted by Ace Dragon View Post
    Another bad sign for Lightwave? It has officially been dropped from the pipeline for the new Star Trek series.
    http://forums.newtek.com/showthread....highlight=trek

    The reason I mention it as a possible bad sign is that its purported use was a key argument for the idea that the app still had a potentially bright future; so much for that.
    I don't see how or why that would be a valid argument for or against LW, other than giving Trekkie diehards bragging rights if they use LW. I'm not surprised LW isn't being used; Canadian studios have their own pipelines, most likely revolving around Maya. Considering Voyager, DS9, and Enterprise were made with LW, it's unfortunate, but not necessarily bad news. The industry has moved on; there are a lot of FX houses in Canada with existing pipelines that can do it cheaper. LW has been so far removed from the conversation over the years that one show wasn't going to save it, imo. That's the real issue, not whether the new Star Trek dropped them or not. They need to get back into the conversation, but considering the vast amount of studio knowledge and pedigree Maya and Houdini have, I don't see LW really breaking into TV at the scale it once did. Effects work is getting cheaper (because it's outsourced); the cost-effectiveness of something like LW is no longer a valid argument when you can outsource the whole thing to Canada, and they in turn to Asia, for arguably better results and less overall cost.

    LW took off because it was cheaper than the competition and had a phenomenal renderer for the price, along with an editing suite to boot. The cost of the software doesn't even factor in anymore.



