In order to graduate high school where I come from, we have to do what's called a graduation project. This consists of a small oral and visual presentation in front of three teachers of your choice, who decide whether or not you pass. My topic was artificial intelligence, and my visual product was this handwritten character recognition project using an artificial neural network. You can teach it your handwriting and it commits it to memory: you provide three examples of each character, and then when you write a letter, it recognizes which one you wrote.
Done totally in gameblender.
VIDEO!
Kind of like Graffiti for Palm.
I made this a while ago and have since graduated.
I ‘will’ release a .blend at a later date. I'm sure some of you would love to get your hands on the code.
Can't forget to thank scabootssca and wiseman303 for answering a few Python questions.
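For anyone curious about the general idea before the .blend is out, here's a rough standalone sketch of the teach-then-recognise approach. It is not the actual code (it uses numpy, which the in-game version doesn't, and every name in it is made up): a drawn stroke gets turned into a fixed-size feature vector, and a tiny neural network is trained on the three samples you give for each character.

```python
# Rough standalone sketch, NOT the actual .blend code; names are made up.
import numpy as np

def stroke_to_features(points, n=16):
    """Resample a drawn stroke [(x, y), ...] to n points and flatten it
    into a fixed-size, position/scale-normalised feature vector."""
    pts = np.asarray(points, dtype=float)
    idx = np.linspace(0, len(pts) - 1, n)
    resampled = np.stack([np.interp(idx, np.arange(len(pts)), pts[:, k])
                          for k in (0, 1)], axis=1)
    resampled -= resampled.mean(axis=0)          # ignore where you drew it
    scale = np.abs(resampled).max() or 1.0       # ignore how big you drew it
    return (resampled / scale).ravel()

class TinyNet:
    """One hidden layer, trained by plain gradient descent on the three
    samples given for each character."""
    def __init__(self, n_in, n_hidden, n_out, lr=0.2):
        rng = np.random.default_rng(0)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.lr = lr

    def forward(self, x):
        h = np.tanh(x @ self.W1)
        o = np.exp(h @ self.W2)
        return h, o / o.sum()                    # softmax over characters

    def train(self, dataset, epochs=300):
        for _ in range(epochs):
            for x, target in dataset:
                h, p = self.forward(x)
                d_out = p - target               # softmax + cross-entropy gradient
                d_hid = (1.0 - h ** 2) * (self.W2 @ d_out)
                self.W2 -= self.lr * np.outer(h, d_out)
                self.W1 -= self.lr * np.outer(x, d_hid)

    def predict(self, x):
        return int(self.forward(x)[1].argmax())

# "Teaching": three example strokes per character (stand-ins for real ones).
chars = ["a", "b"]
samples = {"a": [[(0, 0), (1, 1), (2, 0)]] * 3,
           "b": [[(0, 0), (0, 2), (1, 1)]] * 3}
dataset = [(stroke_to_features(s), np.eye(len(chars))[i])
           for i, c in enumerate(chars) for s in samples[c]]
net = TinyNet(n_in=32, n_hidden=12, n_out=len(chars))
net.train(dataset)

# "Recognising": draw a letter, the net says which one it was.
print(chars[net.predict(stroke_to_features([(0, 0), (1, 1), (2, 0)]))])
```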
Awesome! I definitely want to check out that .blend. I think it’s great when people explore the possibilities of the game engine like this. Really nice work.
Wow! First of all, congrats on passing your senior exit project (that's what I know it as); I still have to complete mine this year, and you did an awesome project. I was thinking of making a Blender project, but I have no idea how to make it relate to my topic… but anyway, good job!
Now that's something I never would have thought to do with the BGE. I too would love to see how you accomplished it. I imagine it has something to do with recording the path of the mouse while the button is held, then comparing it to the average of the three samples given (though I think you probably used a stylus in the video, it's the same concept).
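Just guessing at how that comparison step might look; something along these lines, where every name is invented and none of it is the author's actual code:

```python
# Pure guesswork at the "compare to the average of the samples" step.
# (A real version would also normalise position/size before comparing.)

def resample(path, n=16):
    """Reduce a recorded mouse path [(x, y), ...] to n evenly spaced points."""
    step = (len(path) - 1) / (n - 1)
    return [path[round(i * step)] for i in range(n)]

def average_template(samples, n=16):
    """Average the three teaching samples point-by-point into one template."""
    resampled = [resample(s, n) for s in samples]
    return [(sum(s[i][0] for s in resampled) / len(resampled),
             sum(s[i][1] for s in resampled) / len(resampled))
            for i in range(n)]

def distance(a, b):
    return sum((ax - bx) ** 2 + (ay - by) ** 2
               for (ax, ay), (bx, by) in zip(a, b))

def recognise(path, templates):
    """templates: {character: averaged template}; return the closest match."""
    probe = resample(path)
    return min(templates, key=lambda c: distance(probe, templates[c]))
```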
Anyway, expect to get BlenderNationed with this one… and congratulations on your graduation.
Amazing, I always wondered how to do that with Python. Did you use some external library for recognising gestures, or did you write your own code? I'm really curious how you did that. Great job!
:eek: Whoa. Lol, you beat me hands down, hehe. Maybe I should have tried doing this in the BGE too.
Haha, I did a neural network character recognizer in Python that just spit out values/words in the console as one of my side senior projects in college. It was very slow to train it to recognize the characters, though. Wow, I would love to get my hands on the code and the .blend too.
:D Now I'll have to keep in mind that the BGE isn't just for walkthroughs and games.
The circular loading animation in between teaching samples is actually faster; it only looked slow because of Fraps. I put it in because, after clicking ‘next’, you would otherwise already be drawing your next sample wherever you clicked. So now there's a short delay (1 sec or so) before you can start drawing the next sample.
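If it helps, the cooldown itself is basically just a timer like this (simplified plain Python with made-up names, not the actual logic-brick setup in the .blend):

```python
# Simplified sketch of the 'next'-button cooldown; the real thing is wired
# through the game engine's logic, not a standalone script.
import time

COOLDOWN = 1.0            # roughly one second before drawing is accepted again
_next_clicked_at = 0.0

def on_next_clicked():
    """Called when 'next' is pressed: start the cooldown / loading animation."""
    global _next_clicked_at
    _next_clicked_at = time.time()

def drawing_allowed():
    """True once the cooldown has passed, so a stray click right after
    pressing 'next' doesn't start the next sample by accident."""
    return time.time() - _next_clicked_at >= COOLDOWN
```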