Well, again, a simple Google search should turn up fairly relevant links, but in short: you can think of a generator as an extended iterator.
Here’s an example:
>>> def myGenerator(n):
...     i = 0
...     while i < n:
...         yield i
...         i += 1
...
>>> it = myGenerator(10)
>>> for i in it:
...     print(i)
...
0
1
2
3
4
5
6
7
8
9
>>> it = myGenerator(2)
>>> next(it)
0
>>> next(it)
1
>>> next(it)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
A typical sequence iterator would simply provide the next item in some predefined sequence. But a generator has to actually “generate” the next value, by running the defined generator function until it “yields” the required value - hence the yield keyword.
So, when yield is hit, the generator function is frozen - its state (local variables and the current position in the code) is preserved, so that on the subsequent call to next, execution continues from where it left off to yield the next value.
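A small sketch of that freezing behavior (the function name `chatty` is just made up for illustration) - the print calls show exactly when the body runs:

```python
def chatty():
    print("started")   # runs only when the first value is requested
    yield 1
    print("resumed")   # execution continues from the frozen point
    yield 2

g = chatty()           # nothing runs yet - the body starts on the first next()
first = next(g)        # prints "started", then freezes at the first yield
second = next(g)       # prints "resumed", then freezes at the second yield
print(first, second)   # 1 2
```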
When execution runs off the end of the generator function (or hits a bare return statement), a StopIteration exception is raised, just as any sequence iterator raises it at the end of its sequence. A for loop catches this exception for you automatically, but if you're calling next on an iterator manually, you need to catch it yourself with a try-except block.
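For example, a minimal sketch of catching that exception by hand (the `countdown` generator here is hypothetical):

```python
def countdown(n):
    while n > 0:
        yield n
        n -= 1

it = countdown(2)
print(next(it))   # 2
print(next(it))   # 1
try:
    next(it)      # the loop condition fails, the body ends -> StopIteration
except StopIteration:
    print("generator exhausted")
```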
Now, I realize that this small example doesn't really illustrate the true power of generators, but there are many benefits to A) having a function that can preserve state, and B) generating values only when they're actually needed, rather than wasting a whole bunch of memory on really long sequences.
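To make benefit B concrete, here's a rough sketch comparing a list comprehension with the equivalent generator expression (the exact byte counts will vary by Python version, so they're only hinted at in the comments):

```python
import sys

# A list comprehension materializes every value up front...
squares_list = [x * x for x in range(100000)]
# ...while the equivalent generator expression produces them on demand.
squares_gen = (x * x for x in range(100000))

print(sys.getsizeof(squares_list))  # hundreds of kilobytes
print(sys.getsizeof(squares_gen))   # a small, constant size
print(sum(squares_gen) == sum(squares_list))  # True - same values either way
```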
Interesting/relevant note: in Python 3.x (which is what the BGE runs), the well-known range function no longer builds a full list up front as it did in Python 2; it returns a lazy range object. Strictly speaking that's a sequence type rather than a generator, but it produces values on demand in exactly the same spirit.
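A quick sketch of that laziness - and of the things a range object can do that a plain generator can't:

```python
r = range(10 ** 12)   # no trillion-element list is allocated
print(500 in r)       # True - membership is computed arithmetically
print(r[10])          # 10 - unlike a generator, a range supports indexing
print(len(r))         # 1000000000000 - and it knows its own length
```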
There are also higher concepts lurking in the background, but I don’t want to comment on them here, because, frankly, I’m still trying to fully understand them myself - Although, I’ve learned enough to know that they are very, very powerful.
Hope this helps.