Plugin to display an animated piano roll for presentation

Got a great idea for the future of LMMS? Post it here.
Forum rules

Make sure to search to see if your idea has been posted before! Check our issue tracker as well, just to make sure you are not posting a duplicate: https://github.com/LMMS/lmms/issues

Hello guys, I am starting to get a little better at LMMS and I want to post videos on YouTube. But if I record the normal piano roll, it can't show the whole keyboard and it can show only one instrument at a time; and if I record the Song Editor, everything is so small that it becomes boring to watch. So could you please implement a plugin that displays the music created in LMMS? Something like:

1- I press "Display animated piano roll";
2- I choose some options like "Show the music sheet for one instrument", "Hide instrument X", "Make instrument X's notes blue", etc.;
3- I choose a display style, for example "real piano roll" or "colored lines"; this part should be extensible so that others can create different kinds of visualization;
4- When I press the play button, it shows something like these videos:

http://www.youtube.com/watch?v=YX4-nexuTN4
http://www.youtube.com/watch?v=npVO2uuYoJs
http://www.youtube.com/watch?v=VF4MxYXZeLU


Thanks for your attention :)
It would certainly be a nice feature for some.
In theory, somebody out there should be able to write a parser script for LMMS's .mmp(z) files. They should be simple to decode, since they're just a specialized kind of XML file. (Feel free to open a simple song saved as .mmp with Notepad++.) The animations wouldn't come from LMMS itself, but from other software.

If a programmer took the time to figure out the format, something like a Python script could be written to produce an animation in Blender or similar software from the .mmp(z) data. Then it would just be a matter of syncing the video output to the music exported from LMMS. Depending on how the parser is coded, it could not only produce "ready-made" visuals of various types (like an animated piano roll with different colored flying bars), but in theory the timeline data could also drive or trigger the state of various rendered objects, so you could produce animations comparable to Animusic.
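As a rough illustration of that parsing step, here is a minimal sketch in Python using the standard library's ElementTree. It assumes the project is an uncompressed .mmp file, that note events appear as `<note>` elements inside `<pattern>` elements with `pos`, `len`, and `key` attributes, and that positions are measured in ticks at 48 ticks per quarter note; those structural details are assumptions based on typical LMMS project files, not a verified specification, so check an actual .mmp in a text editor before relying on them.

```python
import xml.etree.ElementTree as ET

# Assumption: LMMS's default timing resolution (ticks per quarter note).
TICKS_PER_QUARTER = 48

def extract_notes(mmp_xml, bpm=140):
    """Parse LMMS project XML (an uncompressed .mmp) and return a sorted
    list of (start_seconds, duration_seconds, key) tuples, suitable for
    driving an animation timeline in Blender or similar software."""
    root = ET.fromstring(mmp_xml)
    seconds_per_tick = 60.0 / (bpm * TICKS_PER_QUARTER)
    notes = []
    for pattern in root.iter("pattern"):
        # A pattern's own position offsets all the notes inside it.
        offset = int(pattern.get("pos", 0))
        for note in pattern.findall("note"):
            start = (offset + int(note.get("pos", 0))) * seconds_per_tick
            length = int(note.get("len", 0)) * seconds_per_tick
            notes.append((start, length, int(note.get("key", 0))))
    return sorted(notes)
```

From there, each tuple could be turned into a keyframed bar or a trigger for a rendered object, and the timing would line up with the audio as long as the same BPM is used for the export.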

Unfortunately I'm not a programmer myself (I lack the patience for it), but considering what is known, it should be technically possible if somebody with the right skills actually wanted to do it.