Lemmings Forums

Lix => Lix Main => Topic started by: Simon on February 26, 2015, 02:10:37 PM

Title: Porting Lix to SDL, loose ideas
Post by: Simon on February 26, 2015, 02:10:37 PM
This might be of interest only to our small group of Lix source hackers.

I'm playing around with SDL 2. (EDIT: Using Allegro 5 for a rewrite attempt since March 2015. The remaining post here is not altered.) The grand idea is to port Lix from Allegro 4 (called A4) to this library. A4 is getting old, and porting Lix to the current A5 would be similarly elaborate.

A4's BITMAP* is a RAM bitmap. Optionally, it can be a VRAM bitmap, but I don't use that anywhere in A4 Lix.

SDL (by which I mean SDL 2.0) works with three datatypes instead of one: SDL_Surface*, SDL_Texture*, and SDL_Renderer*. The latter comes as a window renderer and a software renderer. From the documentation I've digested, a Surface is a pixel buffer in RAM that the CPU works on, a Texture lives in VRAM, and a Renderer draws Textures, either hardware-accelerated into a window or in software into a Surface.

Now, what are good replacements for the various data structures used in Lix? Everything was easier when I didn't care about drawing speed, but this is one reason why efficient game programming is hard. :>

Image: SDL hello world with Texture (http://asdfasdf.ethz.ch/~simon/etc/lix-with-d.png)

-- Simon
Title: Re: Porting Lix to SDL, loose ideas
Post by: NaOH on February 26, 2015, 06:17:50 PM
Why not let both Cutbit and Torbit use Texture?
Title: Re: Porting Lix to SDL, loose ideas
Post by: Simon on February 26, 2015, 06:23:47 PM
Drawing Texture onto Texture isn't what you "should" do, in that there is no library function to do it.

If the gut feeling is that both should be Texture, then probably Cutbit should use Texture and Torbit should draw via SurfaceRenderer -> Surface -> Texture.

-- Simon
Title: Re: Porting Lix to SDL, loose ideas
Post by: NaOH on February 26, 2015, 06:36:08 PM
I believe SDL_RenderCopy() would work.

But generally, I think one shouldn't be rendering to a texture directly, only rendering textures to the screen. So maybe the intermediate Torbit should be dropped altogether, and Cutbit should perform the correct modular arithmetic when rendering?
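For instance, a sketch of the wrapping a Cutbit would then need at draw time (all names here are hypothetical, not from the Lix codebase): a sprite overlapping the torus seam must be drawn up to four times, once for each wrapped copy that still intersects the canvas.

```cpp
#include <vector>

struct Pos { int x, y; };

// Hypothetical helper: positions at which a sprite of size (w, h), anchored
// at (x, y), must be drawn so it appears seamlessly on a torus of size
// (torusW, torusH). A sprite overlapping the seam needs up to 4 copies.
std::vector<Pos> torusDrawPositions(int x, int y, int w, int h,
                                    int torusW, int torusH)
{
    // Normalize the anchor into [0, torusW) x [0, torusH).
    x = ((x % torusW) + torusW) % torusW;
    y = ((y % torusH) + torusH) % torusH;
    std::vector<Pos> out;
    for (int dx = 0; dx <= 1; ++dx)
        for (int dy = 0; dy <= 1; ++dy) {
            int px = x - dx * torusW;
            int py = y - dy * torusH;
            // Keep a copy only if it actually intersects the canvas.
            if (px + w > 0 && py + h > 0)
                out.push_back(Pos{px, py});
        }
    return out;
}
```

A sprite fully inside the torus yields one position; one overlapping a corner yields four.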
Title: Re: Porting Lix to SDL, loose ideas
Post by: Simon on February 26, 2015, 07:24:36 PM
Can't dismiss this idea, it's certainly possible that the information architecture of A4 Lix is flawed when using SDL.

I'm reluctant to do this because Torbit does some modulo calculation outside of drawing: computing distances. Also, I wouldn't want the graphic objects (each lix, each terrain tile, i.e., things that refer to some Cutbit and sit somewhere on the map) to carry the information about the torus.
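That distance computation, roughly (a sketch, the name is mine and not from the actual code): on a wrapping axis, the shortest signed distance from a to b may cross the seam, so it is not simply b - a.

```cpp
// Hypothetical sketch of the distance computation a torus canvas must
// offer: shortest signed delta from a to b on a wrapping axis of length
// len. For a non-wrapping axis, plain b - a would be used instead.
int wrappedDelta(int a, int b, int len)
{
    int d = ((b - a) % len + len) % len;  // normalized into [0, len)
    return d > len / 2 ? d - len : d;     // choose the shorter direction
}
```

E.g. on a 640-wide torus, going from x = 620 to x = 10 is a delta of +30 across the seam, not -610.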

Right now, Map inherits Torbit. If Torbit is scrapped, Map will continue to be important. I'd therefore like to have the torus information at least in Map.

-- Simon
Title: Re: Porting Lix to SDL, loose ideas
Post by: ccexplore on February 26, 2015, 08:11:58 PM
The SDL 1.2=>2.0 migration guide (http://wiki.libsdl.org/MigrationGuide) on the wiki you linked to might be a good conceptual resource.  I'm certainly no expert in this area, but based on what you said in the first post, A4 Lix seems a bit similar to how the migration guide describes SDL 1.2:

Quote from: http://wiki.libsdl.org/MigrationGuide#Video
For 2D graphics, SDL 1.2 offered a concept called "surfaces," which were memory buffers of pixels. The screen itself was a "surface," if you were doing 2D software rendering, and we provided functions to copy ("blit") pixels between surfaces, converting formats as necessary. You were almost always working on the CPU in system RAM, not on the GPU in video memory. SDL 2.0 changes this; you almost always get hardware acceleration now, and the API has changed to reflect this.

So conceptually speaking, I expect some similarities in going from A4 => SDL 2.0 as what the migration guide ostensibly covers (SDL 1.2 => SDL 2.0). If so, then the guide may prove useful for this topic.
Title: Re: Porting Lix to SDL, loose ideas
Post by: ccexplore on February 26, 2015, 09:37:03 PM
It might also be helpful, for the benefit of "other readers" (perhaps just me at this point :-[), to briefly mention here how rendering works currently in the A4 game. For example, the game has things like the level's terrain (which is "Map", I think?), "sprites" such as the lixes, interactive objects, trapdoors, etc., and UI elements like buttons and windows. What is the current rendering path for each such thing (in other words, which objects like Torbit/Cutbit etc. does the thing go through until it eventually winds up rendered on the pre_screen)?
Title: Re: Porting Lix to SDL, loose ideas
Post by: Simon on February 27, 2015, 02:53:49 AM
ccexplore, thanks for the link to the SDL porting advice, this sounds very much applicable.

Here's the sketch of the drawing order:

BITMAP* pre_screen; (this lives in RAM, too)
Torbit (owns a BITMAP*) osd; (It's not necessary that osd must be Torbit, BITMAP* would probably do.)
Map (inherits Torbit) map; (This must be Torbit in the current architecture, it's a canvas for drawing gameplay.)
State (owns a Torbit) current_state; (This holds the land, I have several states for networking backup or savestating.)
Cutbit interactive_object, lix_image, ...; (owns a BITMAP* with the entire spritesheet, can make sub-BITMAP*s that only have the desired frame. Cutbits do not change while the program is running! Maybe some more are loaded when new terrain is required from disk, but existing Cutbits don't change anymore.)

When drawing during the normal gameplay:

1. Clear the map rectangle visited by the current scrolling to the level's bg color.
2. Draw the interactive objects that go behind the landscape onto the map.
3. Draw the landscape from current_state's Torbit onto the map.
4. Draw the interactive objects that go in front of the landscape onto the map.
5. Draw the lixes onto the map.
6. Undraw the osd: this removes the mouse cursor from the last frame and restores its background.
7. Draw the GUI elements that have changed since the last frame onto the osd.
8. Draw the mouse cursor onto the osd, saving its background.
9. Blit the map rectangle visited by the current scrolling onto the pre_screen, without transparency.
10. Blit the osd onto the pre_screen, with transparency.
11. Blit the pre_screen to the physical screen in VRAM.
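The cursor undraw and redraw above is the classic save-under technique. A minimal sketch on a plain pixel buffer (the Canvas type and all names here are mine, not from the Lix codebase):

```cpp
#include <cstdint>
#include <vector>

// A tiny software canvas: w*h pixels, row-major.
struct Canvas {
    int w, h;
    std::vector<uint32_t> px;
    Canvas(int w, int h) : w(w), h(h), px(w * h, 0) {}
    uint32_t& at(int x, int y) { return px[y * w + x]; }
};

// Save the rectangle under the cursor, then draw the cursor.
// Returns the saved background so it can be restored next frame.
std::vector<uint32_t> drawCursorSavingUnder(Canvas& c, int cx, int cy,
                                            int cw, int ch, uint32_t color)
{
    std::vector<uint32_t> saved;
    saved.reserve(cw * ch);
    for (int y = cy; y < cy + ch; ++y)
        for (int x = cx; x < cx + cw; ++x) {
            saved.push_back(c.at(x, y));
            c.at(x, y) = color;  // stand-in for blitting the cursor sprite
        }
    return saved;
}

// "Undraw": put the saved background back before the next frame's drawing.
void restoreUnder(Canvas& c, int cx, int cy, int cw, int ch,
                  const std::vector<uint32_t>& saved)
{
    int i = 0;
    for (int y = cy; y < cy + ch; ++y)
        for (int x = cx; x < cx + cw; ++x)
            c.at(x, y) = saved[i++];
}
```

This way, only the cursor rectangle is touched each frame instead of redrawing the whole osd.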

The landscape in current_state may be altered by the lix skills during calc(), the logic calculation. This happens at a different time than draw(). What's important is that the landscape will not necessarily be the same image upon the next call to draw(); I don't think of it as a sprite.

-- Simon
Title: Re: Porting Lix to SDL, loose ideas
Post by: ccexplore on April 02, 2015, 02:02:33 AM
Slightly off topic, but this together with your revival of the "end-user-translatable Lix" thread reminds me of this one thing, which is a good idea to take into account as you start doing your D port:

Currently, the dimensions of UI elements (popup windows, buttons, etc.) in C++ Lix are all hardcoded. While I never pretended to try to support complex languages like Chinese/Japanese/Korean, I can easily see one problem: the current dimensions look a little too small to make those languages legible. I think the font size being used would need to be bumped up a little for good legibility in those languages, but then they probably won't fit properly within the existing hardcoded heights of most buttons and labels.

I would suggest that the D port, at a minimum, introduce a global height and a global width scaling factor that can be applied to most UI, and incorporate those scaling factors into all calculations of sizes and positions of such UI. And allow those scaling values to be adjustable by translators (along with font sizes, though they can do that already, even in the C++ port, by simply replacing the game's font files). I believe that will adequately address the issue without introducing too much complexity. Just an idea.
Title: Re: Porting Lix to SDL, loose ideas
Post by: Simon on April 02, 2015, 02:26:16 AM
A good thing. I don't yet have any code that depends on these design ideas, but they have come to my mind, too.

Variable resolution will go in. A5 offers something it calls a "fullscreen window", which is a window without edges or title bar, the size of the screen. It looks like fullscreen, and alt-tabbing from it is very fast. I really want to use this, though I'm restricted to the user's desktop resolution for it.

Normal windowed mode will also go in. However, switching between window and fullscreen window will be impossible, or at least slow and hard to implement, with the VRAM bitmaps.

A5 depends on FreeType, which loads TTF fonts at any desired size. Example image. (http://asdfasdf.ethz.ch/~simon/etc/lix-with-d-2015-04-02.png)

I want the GUI to be scalable without my ugly pixelwise scaling blit from C++/A4 Lix. Your idea of a scaling factor right in the GUI code is a good one: the GUI elements take coordinates before scaling and draw after applying the scaling. It doesn't account for widescreen displays, but I'm sure the idea can be extended in a nice way.
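A sketch of that design (all names here are hypothetical, not from either codebase): widgets store unscaled design coordinates, and only the conversion to screen pixels at draw time multiplies by the global factors.

```cpp
// Hypothetical sketch: the UI is laid out in unscaled design units; a
// global scale is applied only when converting to screen pixels at draw
// time. Translators could then bump yFactor for taller text rows.
struct GuiScale {
    double xFactor;  // e.g. 1.25 for wider buttons
    double yFactor;  // e.g. 1.5 for taller, more legible CJK labels
};

struct Rect { int x, y, w, h; };

// Convert a widget's design-space geometry to screen pixels.
Rect toScreen(const Rect& design, const GuiScale& s)
{
    return Rect{
        static_cast<int>(design.x * s.xFactor),
        static_cast<int>(design.y * s.yFactor),
        static_cast<int>(design.w * s.xFactor),
        static_cast<int>(design.h * s.yFactor),
    };
}
```

Since nothing outside the draw step sees scaled coordinates, the factors can change at runtime without touching layout code.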

-- Simon
Title: Re: Porting Lix to SDL, loose ideas
Post by: vanfanel on February 08, 2016, 02:40:27 AM
Hi, Simon!

Any chance of seeing Lix ported to SDL2? That would take Lix to the mighty Raspberry Pi, for example, where Allegro does not perform well at all, while SDL2 has accelerated graphics support.
Lix on a $5 computer could be great!
Title: Re: Porting Lix to SDL, loose ideas
Post by: namida on February 08, 2016, 02:52:53 AM
I attempted to compile and run Lix on Ubuntu on an Odroid-XU4, which is somewhat similar to (though much more powerful than) the Raspberry Pi. Although it worked at full speed, there were problems with sound and the Cuber skill - though this is based on the C++ version; I never tested the D/A5 version. (Note to self and Simon: Let's test that sometime too!) So I'm not sure how well it would work on ARM-based hardware in general.
Title: Re: Porting Lix to SDL, loose ideas
Post by: Simon on February 08, 2016, 09:02:58 AM
Quote from: vanfanel on February 08, 2016, 02:40:27 AM
Raspberry Pi, for example, where Allegro does not perform well

Allegro 4 or Allegro 5? I'd be surprised if A5 were slow on the Raspberry. A5 and SDL2 should have comparable hardware graphics acceleration. Some forum members have criticized using A5 over SDL2 because they were more familiar with SDL2.

If you're feeling adventurous, build the D/A5 version that's still in development (https://github.com/SimonN/LixD).

-- Simon
Title: Re: Porting Lix to SDL, loose ideas
Post by: vanfanel on February 09, 2016, 10:46:27 AM
Quote from: Simon on February 08, 2016, 09:02:58 AM
Allegro 4 or Allegro 5? I'd be surprised if A5 were slow on the Raspberry. A5 and SDL2 should have comparable hardware graphics acceleration.

If you're feeling adventurous, build the D/A5 version that's still in development (https://github.com/SimonN/LixD).

Ah, Allegro 5 seems to have GLES support, yes. But it's implemented in a way that the rendering context depends on X11 to work :(
So Allegro 5 is still not for me. I could add a dispmanx backend (the native RPi rendering context), but I have no time to add it to yet another library.
No word on the SDL2 port, then? I understand you had something working already.
Title: Re: Porting Lix to SDL, loose ideas
Post by: Simon on February 09, 2016, 11:01:57 AM
No SDL2, at least in the upcoming year. I'm too busy, too.

For a few weeks now, Lix regulars have been looking at the port and giving lots of feedback. I'm ironing out major problems to get speed and better consistency with C++/A4 Lix. I wouldn't like to swap libraries at all right now.

Your case is the first substantial technical drawback of A5 compared to SDL2. Thanks for pointing it out. I view low-level graphics drawing as a black box. Did SDL2 have a similar problem, and have you written code for SDL2 to improve the situation?

-- Simon