Is it possible to port the game engine to Direct X...

without having to port the entire program (i.e. the non game engine portions)?

I know this isn’t a trivial task, but my graphics chip doesn’t really support Open GL, and I can’t find any 3D engines in my price range (which, considering I have no current income, is $0).

By the way, I’m not making a request here. I just want to know if I can do it myself without having to port the entire thing.

Well, you could, but you would have to rewrite a lot of code in DirectX. It might be easier to port the entire program, considering you would have to dig out a lot of functions that the engine and the renderer don’t share. Porting to DirectX might also make your renders faster. I would say download the source code, get an overview of what needs to be converted on the graphics end, and look for any math shortcuts. It would be a good idea to be well oriented in OpenGL programming as well as DirectX.

umm

http://www.ogre3d.org/

there are a BUNCH of directx/opengl/software 3d engines, you’ll just need to be willing to code

If you’re too lazy for linux [I assume you aren’t on a mac], and you’re probably too cheap for MS visual studio:
http://www.bloodshed.net/devcpp.html

What card do you have? It must be pretty old, and a pretty decent geforce card is only 30-60 bucks…

Pooba

I don’t think it’s that hard - you need to:

  1. Write a DirectX implementation of RAS_IRasterizer, and RAS_IRenderTools
  2. Write a replacement for ghost (window setup & input handling)

Since he wants to use DirectX, it’s a pretty safe bet he’s on windows :wink:

dwmitch, what card do you have? Even my Intel 82810E supports OpenGL well enough to run Blender.

I’m running an 810 chipset. It handles it well enough for Blender, as long as I don’t install the new drivers (or run Blender at 16 bit color setting with the new drivers, which leads to intolerable banding), but regardless of what setting I put it at the game engine runs slow.

I’m currently looking for work, so I can’t even afford a $30 - $60 card.

[quote]
umm

http://www.ogre3d.org/

there are a BUNCH of directx/opengl/software 3d engines, you’ll just need to be willing to code

If you’re too lazy for linux [I assume you aren’t on a mac], and you’re probably too cheap for MS visual studio:
http://www.bloodshed.net/devcpp.html
[/quote]

Perfect! My brother gave me his copy of visual studio 6 when he switched over to visual basic. This engine simplifies the parts I have trouble with (pretty much all of the initialization stages), but still allows me to advance my programming skills, not to mention the fact that I can see how an engine is written now.

You mean the older drivers didn’t have the texture banding? I’ll have to check that out, what version of the drivers are you using?

Yeah, but beware when buying some video cards. A friend of mine got one which runs Blender’s game engine at around 200 fps, and he can even run it at 400 fps if he unchecks some boxes. That isn’t really cool, because almost all Blender games will be ruined when you play them. It would be cool if there were a script to set the max fps of the Blender game engine. Is that possible?? :slight_smile:

This is far from the original topic, but you can use a simple script:

```python
import time
time.sleep(0.01)
```

This will reduce processor load and framerate, depending on the parameter of the sleep() function and the frequency at which this script is executed.
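To see what cap a given sleep value implies, here is a small self-contained timing demo (plain Python, runnable outside Blender; the 0.01 s value matches the script above). If time.sleep(s) runs once per frame, the framerate can never exceed 1/s:

```python
import time

SLEEP_PER_FRAME = 0.01  # same value as above -> caps fps at 1/0.01 = 100

frames = 20
start = time.perf_counter()
for _ in range(frames):
    # (game logic and rendering would run here, adding more time per frame)
    time.sleep(SLEEP_PER_FRAME)
elapsed = time.perf_counter() - start

fps = frames / elapsed
# fps is at most 1 / SLEEP_PER_FRAME, and in practice lower,
# because real game logic adds time on top of the sleep.
print(round(fps, 1))
```

So sleep(0.01) caps the game at roughly 100 fps before the frame’s own work is even counted.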

But making a game dependent on framerate is a very bad habit. It can make the game unplayable on very fast and also on very slow computers. We should do as much as possible to get rid of this dependency.

Well, fixing the physics engine would be a start. I still haven’t figured out a good workaround to the “jumping problem”: characters jump higher on slower computers…

[quote]
You mean the older drivers didn’t have the texture banding? I’ll have to check that out, what version of the drivers are you using?
[/quote]

The older drivers have banding at 16 bit color, but the Blender GUI runs faster on 24 bit color with the older ones. The newest one makes it intolerably slow on the 24 bit setting.

I don’t know what happened, but when I was downloading the driver I checked to see which chipset I would need it for. It was listed as an Intel 810 chipset. I uninstalled that driver to make Blender work better, and I just checked to see the version number of the default driver, and it said I was using an Intel 82810E graphics controller. Either it came installed with the wrong drivers and I upgraded to the wrong one or I was checking the wrong thing.

Gah! I posted about that a while ago. For jumping, or other instant duration forces, use LinV in a motion actuator or applyImpulse() in Python. Using force is OK for movement that will be sustained over several frames, and NEVER use DLoc or DRot.

If you follow those rules then your game should run the same no matter the framerate.
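Why an impulse behaves the same at any framerate can be shown with plain physics. This is a sketch in ordinary Python, not the game engine API, and the function name and numbers are illustrative: an impulse J = m·v sets the take-off velocity in one shot, while a per-frame force gets integrated over however many frames the engine happens to run.

```python
import math

def jump_impulse(mass, jump_height, gravity=9.81):
    """Impulse (in kg*m/s) that launches a body of `mass` to `jump_height`.

    The take-off speed for height h comes from v = sqrt(2*g*h),
    and the impulse is J = m * v. Because an impulse changes velocity
    instantaneously, the jump height is identical at 30 fps and 300 fps;
    a force applied every frame would accumulate frame by frame instead.
    """
    takeoff_speed = math.sqrt(2.0 * gravity * jump_height)
    return mass * takeoff_speed

# Illustrative numbers: an 80 kg character jumping 1 m
impulse = jump_impulse(80.0, 1.0)
```

The same computed velocity is what LinV in a motion actuator sets directly, which is why it works where DLoc does not.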

Thanks wiseman, a lot of posts go through this forum over time, and it’s easy to miss things. There should be a sticky workaround FAQ thread :slight_smile:

Well, actually, setting a max fps by slowing the processor down would be useful if you do it like this:

  • run the game
  • a screen appears showing a simple rotating cube, then asks how many fps it runs at (by displaying the fps with a property and a Blender font)
  • let the user input his fps; when he clicks next, it calculates how much the processor speed should be slowed down. :stuck_out_tongue:

Just an idea, but I think it would be quite nice: when people fill in 85 fps, that’s normal, so no slowdown is added. When they type 150 fps, a Python script puts in the slowdown needed to reduce 150 fps to 85 fps :smiley:

There is no need to ask the user to enter the framerate - it can easily be done with a Python script and a Timer property. That has the advantage of adapting to the actual framerate during gameplay.
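A sketch of that adaptive idea in plain Python (time.perf_counter() stands in here for Blender’s Timer property, since this snippet runs outside the engine): measure how long the frame actually took and sleep away only the remainder of the frame budget, so no user calibration is needed.

```python
import time

TARGET_FPS = 85.0                # assumed target framerate
FRAME_BUDGET = 1.0 / TARGET_FPS

def throttle(frame_start):
    """Sleep away whatever is left of this frame's time budget.

    frame_start is the clock value taken at the top of the frame.
    A fast machine sleeps a lot, a slow machine not at all, so the
    game settles at TARGET_FPS on any computer fast enough to reach it.
    """
    remaining = FRAME_BUDGET - (time.perf_counter() - frame_start)
    if remaining > 0:
        time.sleep(remaining)

# Simulated game loop: do a little work each frame, then throttle.
start = time.perf_counter()
for _ in range(10):
    frame_start = time.perf_counter()
    sum(range(1000))             # stand-in for game logic and rendering
    throttle(frame_start)
total = time.perf_counter() - start
# 10 frames should take at least 10 * FRAME_BUDGET seconds in total.
```

Unlike a fixed sleep(0.01), the sleep here shrinks automatically when the game logic itself gets heavier.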

Hmm… I’m using the newest drivers (version 6.7), but the texture banding shows up no matter whether I’m in 16-bit or 24-bit mode. (24-bit is a lot slower, though.)

Intel 82810E graphics controller is the name of the graphics card, 810 is the chipset it uses. You were probably checking different places before and after the install.

There’s a thing called Vsync…

It limits the FPS to your monitor’s refresh rate. Since I run 1280x1024, I have a refresh rate of 60 hertz, which means Vsync limits all framerates to 60 fps. If you’ve turned this off, it will totally mess up Blender games, but as long as you have it on, Blender games will run perfectly fine at 60 fps.

Pooba