The people who build proper home cockpits inside real aircraft, especially interfacing with old aircraft hardware, are always very impressive. This takes a lot of time, dedication, and reverse engineering.
But it is also easy to get started at home. Using some 42 inch televisions and old Android tablets, you can replicate a lot of the immersion very easily, have fun, and see whether you like it. Check out my open source projects and free apps for X-Plane and more details on building your own immersive flight simulator experience: https://www.waynepiekarski.net/projects/xplane.htm
Although be aware, these projects tend to turn into more of a building adventure than actual flying :)
I loved the original Glider game on the old Mac. Back in 1997 when I was in undergrad I made my own port using ASCII art to run on the VT100 terminals we had at the time. When I started learning Android, I ended up writing a VT100 emulator with the NDK to bring it back to life on modern devices: https://github.com/waynepiekarski/android-glider
Yes, you can hide the instruments. But that changes the projection, and then nothing lines up across the displays. You have to hide the instruments on all of the displays or none of them.
Perhaps you could add a fourth display below center for the instruments, and have the three mains just show out-the-cockpit views. The rendering method you're using seems like it would generalize that far, with the simulation running on the VM displaying the instrument panel, and the center view synced the same way the sides are now.
(Don't mind me, though - I'm just letting my old flight-sim bug live vicariously for a while through your extremely impressive setup...)
Unfortunately I don't have any more space below the center monitor to put another display without it getting in the way of the yoke :) I tried to fit a smaller LCD display in there, but it just got in the way, and the extra display to render slowed X-Plane 11 down.
This should be possible with any emulator for old school systems, like a 286 with DOS, C64, Atari, etc. All you need is some way in the game to "look left" and "look right", and the ability to tune the field of view for a correct projection. If the game can't do that, then it might not be possible.
Also, the hacks in the blog post won't work as well for anything newer than DOS, like Windows games, since there is an OS in the way and writing over memory locations gets a lot harder. You would need to approach the problem differently.
It only took me a few days, maybe 20 hours total? ... I started this back in January but it took me a while to capture the videos and write it up since I've been reworking the physical frame to make the monitors fit closer.
1/4 of the time was spent running GDB trying to find the memory locations and work out the encoding scheme.
1/4 of the time was spent getting familiar with DOSBox, building on Ubuntu, and syncing the data around via UDP.
1/4 of the time was spent trying to automate the start up so I didn't need to press a hundred keys manually to configure all the displays each time I restarted it.
The remaining 1/4 of the time was spent capturing nice videos and writing it all up :)
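The variable hunt in step one boils down to snapshot diffing: dump the emulator's memory, change one thing in the game (e.g. turn left), dump again, and keep only the offsets that changed the way you expect. Here's a hypothetical sketch of that technique (this is the general idea behind tools like scanmem, not the actual GDB workflow from the post):

```python
# Hypothetical sketch of the snapshot-diff technique for locating a game
# variable in an emulator's memory (not the actual GDB workflow).

def narrow_candidates(snapshots, predicates):
    """Each snapshot is bytes of emulated RAM taken at a known moment;
    each predicate says how the value at a surviving offset should have
    behaved between consecutive snapshots: 'changed' or 'same'."""
    candidates = set(range(len(snapshots[0])))
    for (a, b), pred in zip(zip(snapshots, snapshots[1:]), predicates):
        if pred == "changed":
            candidates = {i for i in candidates if a[i] != b[i]}
        else:  # "same"
            candidates = {i for i in candidates if a[i] == b[i]}
    return sorted(candidates)

# Toy example: only offset 2 changes across snapshots, so after two
# "changed" filters it is the sole surviving candidate.
snaps = [bytes([9, 9, 10, 7]), bytes([9, 9, 11, 7]), bytes([9, 9, 12, 7])]
print(narrow_candidates(snaps, ["changed", "changed"]))  # -> [2]
```

A couple of rounds of this usually narrows thousands of candidates down to a handful, which you can then watch in GDB to work out the encoding.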
I've done hacks like this before ... back in 2000 I worked on the team that built ARQuake but we had source code http://www.tinmith.net/arquake/
Really cool. So if there are 3 separate instances running, how do they stay in sync? Surely these 18 bytes are not everything? Do you feed the same input to all 3 instances at the same emulated time?
The left and right FS4 instances are paused at startup, while the front view is the master. The FS4 renderer seems happy to have the XYZ HPR values overwritten behind its back, and it refreshes the display constantly with the latest values. So the gauges don't work on the left/right displays - ideally I'd get rid of them, but removing the instruments causes the left/right 3D views to no longer align with the front display.
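A minimal version of that master-to-slave sync could look like the sketch below. The field layout is an assumption for illustration (six values for XYZ position plus heading/pitch/roll, three bytes each, giving the 18-byte packet); the post's real encoding had to be reverse engineered:

```python
import socket

# Hypothetical sketch of pushing the master's camera state to the slave
# instances over UDP. We assume six values (XYZ + heading/pitch/roll)
# packed as 3-byte signed little-endian ints -> an 18-byte packet.

def pack_state(values):
    return b"".join(v.to_bytes(3, "little", signed=True) for v in values)

def unpack_state(payload):
    return [int.from_bytes(payload[i:i + 3], "little", signed=True)
            for i in range(0, 18, 3)]

state = [1200, -50, 3000, 90, -5, 2]   # x, y, z, heading, pitch, roll
packet = pack_state(state)
assert len(packet) == 18

# The master sends this every frame; each slave would recv() it and
# write the six values over its own FS4 instance's memory locations.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9999))
print(unpack_state(packet))  # -> [1200, -50, 3000, 90, -5, 2]
```

Since the slaves are paused and only render, there's no drift to worry about: whatever camera state arrives last is what they draw.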
I imagine with more studying of the memory locations, you could sync over the instruments too. Or if you really wanted to, you could just grab the video memory for the instruments on the master and blast that over the network too. Displays were so small back then that you could easily compress and send this.
Looking at the gauges on the side monitors, things like vertical speed read zero. I guess the programs running the side views are in a lookaround mode with the viewpoint being banged in manually by the network hack. Only the center view is the master actually simulating the flight.
Actually the projection is correct, but the camera is not exactly at the center of projection, so it appears that the lines are bent across the monitor edges. However, if you place your head at the exact center of projection, they look correct. I'm actually doing three separate projections, instead of just stretching one image across three displays. The GoPro's field of view was not able to capture everything, so I had to pull the camera back a bit, and so it's not perfect. A spherical camera would have done a better job here.
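The geometry behind those three projections is straightforward: each display gets its own frustum whose horizontal FOV comes from the screen width and the eye-to-screen distance, and the side views are the front view yawed by one display FOV. A sketch with made-up numbers (flat panels angled so their centers face the viewer - these are not the dimensions from the post):

```python
import math

# Sketch of per-display projection parameters for three flat monitors
# around the viewer's head. Numbers are illustrative, not from the post.

def display_fov_deg(width_m, distance_m):
    """Horizontal FOV of one display seen from the center of projection."""
    return math.degrees(2 * math.atan((width_m / 2) / distance_m))

width, distance = 0.93, 0.60   # e.g. roughly a 42" 16:9 panel at 60 cm
fov = display_fov_deg(width, distance)

# With the side monitors angled toward the eye, each side camera is the
# front camera rotated by exactly one display FOV.
for name, yaw in [("left", -fov), ("front", 0.0), ("right", +fov)]:
    print(f"{name:>5}: fov={fov:.1f} deg, yaw offset={yaw:+.1f} deg")
```

If your head sits at that center of projection, straight lines continue cleanly across the bezels; move off it and they appear to bend, which is exactly the effect visible in the GoPro footage.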
Ah of course! Thanks for that explanation. It already looked way closer to correct than most games manage.
Man I'd love to try this. I played a lot of Flight Simulator 5.1 back in the day, which actually included the whole Flight Simulator 4 world as well - you could choose to load it in from one of the menus. The untextured terrain was so much harder to read though!