« Last post by Juju on March 15, 2018, 09:13:23 am »
As I mentioned earlier on Discord, the 286 is a 16-bit CPU, so a single segment can't address more than 2^16 bytes (64 KiB) of memory at once; you can't just dump the entire image to the graphics adapter in one go. However, you can change which segment of memory the CPU sees, so up to 2^24 bytes (16 MiB) is accessible. (Older Intel CPUs had a 20-bit address bus; the 80286 widened it to 24 bits, but you have to enable the A20 address line to reach memory past 1 MiB, for compatibility with older software that expected the address space to wrap around there.) In DOS, the first 640 KiB is directly mapped to RAM, while the area above it is memory-mapped I/O, and you need memory manager software to map the rest of the RAM. (You can also access more than 16 MiB of RAM with some sort of bank switching, but it's starting to get complicated here.) Read more about it: https://en.wikipedia.org/wiki/DOS_memory_management
A 1024x768x8 picture is 768 KiB, so theoretically you could fit it all into RAM, but you'd have to switch segments every 64 KiB, avoid the address space that isn't mapped to RAM, and use a memory manager. It's a big mess, really. And even then, unlike CGA, EGA, VGA and the like, which merely map the screen somewhere into the memory-mapped I/O space so you can copy your bytes onto it as if it were RAM, the 8514 has a GPU you have to send commands to (which is likely why you need the AI layer), and I don't think you can send all your bytes to it at once either.
So I guess your best bet is probably streaming your file reads directly to the graphics adapter, byte by byte, or chunk by chunk if you can. That is, read a pixel, draw it, and loop for every pixel. Since you have a rectangle function, you might even want to use some sort of RLE as @Jarren Long
described: if you encounter, say, 16 pixels of the same color, draw a rectangle 16 pixels long. RLE decoding this way would be extremely simple.
« Last post by Jarren Long on March 15, 2018, 06:51:08 am »
If you don't want to split up your image, you might be able to get away with using some simple compression on the bitmap, like Run-Length Encoding (RLE). If your image has scanlines with the same color repeated over many pixels, RLE could shrink your image quite a bit, which would let you load the whole thing into memory. For example, if the first scanline at the top of your bitmap is all one color (we'll say black, 0x00), you could RLE the entire line as (count, color) byte pairs. A one-byte count tops out at 0xFF (255), so the line would encode as FF00 FF00 FF00 FF00 0400 (read it as "255 pixels of color 0x00, four times, then 4 more"): 10 bytes instead of 1024 00's. At that point, you'd just need to update your code to parse the RLE pairs and reinflate the bitmap on the fly while you render it. You're sacrificing clock cycles to spare memory by doing that, though.
Extra credit: instead of encoding the image beforehand, write code that will RLE the bitmap as you read it in.
That would solve the image size issue at least, so long as RLE is appropriate for your image. If you're trying to display something like a big color gradient, you're S.O.L.: RLE would actually make the image larger.
For drawing more than one pixel at a time with the library you have, you'll need to dig into the API to see how it accesses the video buffer to write out to the graphics area of memory, and then reproduce it yourself. The hardware specs for your device will probably have a breakdown of the memory locations somewhere, and assembly will probably be required. If you're willing to dig in that deep, you can probably just read the bitmap directly into the graphics memory area and skip the arrays altogether.
Now the big question: why on earth would you want to play with a 286?!?
« Last post by gameblabla on March 13, 2018, 10:06:14 am »
I actually bought Asterix & Obelix XXL a few years later for my GBA SP.
I found the graphics to be pretty nice (though if you look closely, you can see they're merely 2D sprites in a 3D environment).
However, the game was pretty repetitive (and also pretty hard later on).
Plenty of other 3D games got cancelled as well: Dune Ornithopter Assault, Shi'en GBA racer, GP Advance (very impressive!)...
Shame that most of these GBA games ended up cancelled.
« Last post by gameblabla on March 13, 2018, 09:39:29 am »
I was having some fun trying to display graphics in all kinds of different graphics modes in DOS.
So far, I've been able to try out CGA, EGA and (of course) VGA, including the various Mode X resolutions.
Just recently, I managed to find out how to support the IBM 8514 and write pixels to it. (I'll talk about it in another post.)
However, I encountered some issues. The IBM 8514 is supposed to work with an IBM AT, which comes with an 80286, and that processor is 16-bit.
Making matters worse, arrays/segments cannot be bigger than 64 KiB, even if you have enough memory.
(And even if you are using protected mode, as I found out later. Not to mention, it's full of bugs.)
Also, IBM never released hardware documentation for it; it only released documentation for AI (the Adapter Interface), a software layer.
And that software layer is very unsuitable for pixel drawing and framebuffer access.
The adapter supports a resolution of 1024x768 with 256 colors, and the only thing I can do is draw one pixel at a time.
The picture I want to display is exactly that resolution, and it's 768 KiB big.
You can see where this is going...
Here's the relevant C code for it.
typedef unsigned char byte;
typedef unsigned short word;

typedef struct tagBITMAP           /* the structure for a bitmap. */
{
    word width;
    word height;
    byte *data;
} BITMAP;

void drawquad(unsigned long col, short x, short y, unsigned short w, unsigned short h)
{
    static HRECT_DATA quad;

    quad.coord.x_coord = x;
    quad.coord.y_coord = y;
    quad.width = w;
    quad.height = h;
    hscol_data.index = (long) col;
    HSCOL(&hscol_data);            /* set supplied colour */
    HBAR();                        /* begin area */
    HRECT(&quad);                  /* draw quadrilateral */
    HEAR(&hear_data);              /* end area */
}

void draw_pict(BITMAP *bmp, int x, int y)
{
    unsigned short i, j;

    for (j = 0; j < bmp->height; j++)
        for (i = 0; i < bmp->width; i++)
            drawquad(bmp->data[i + (j * bmp->width)], x + i, y + j, 1, 1);
}

void load_bmp(const char *file, BITMAP *b)
{
    FILE *fp;
    word num_colors;
    long index;

    /* open the file */
    if ((fp = fopen(file, "rb")) == NULL)
    {
        printf("Error opening file %s.\n", file);
        exit(1);
    }

    /* check to see if it is a valid bitmap file */
    if (fgetc(fp) != 'B' || fgetc(fp) != 'M')
    {
        fclose(fp);
        printf("%s is not a bitmap file.\n", file);
        exit(1);
    }

    /* read in the width and height of the image, and the
       number of colors used; ignore the rest */
    fread(&b->width, sizeof(word), 1, fp);
    fread(&b->height, sizeof(word), 1, fp);
    fread(&num_colors, sizeof(word), 1, fp);

    /* assume we are working with an 8-bit file */
    if (num_colors == 0) num_colors = 256;

    /* try to allocate memory */
    if ((b->data = (byte *) malloc((word)(b->width * b->height))) == NULL)
    {
        fclose(fp);
        printf("Error allocating memory for file %s.\n", file);
        exit(1);
    }

    /* Ignore the palette information for now.
       See palette.c for code to read the palette info. */

    /* read the bitmap, one scanline at a time (BMP stores rows bottom-up) */
    for (index = ((long)b->height - 1) * b->width; index >= 0; index -= b->width)
        fread(&b->data[index], sizeof(byte), b->width, fp);

    fclose(fp);
}
So with the 286's limitation of 64 KiB for arrays, I am stuck.
Actually, I thought of several solutions, none of which worked or are ideal:
- Separate the picture into several parts. This would work, but it's a huge inconvenience.
- Call the 80286 a brain-dead chip (thanks, Billy) and make the program 32-bit only.
The problem is that AI comes with a small bit of assembly, and I was unable to make it work in protected mode.
Plus, it wouldn't work on a stock IBM AT.
I looked at the only game for the 8514, Mah Jong -8514-,
and it is also using AI.
I played the game and noticed it draws its graphics the way you would with vector graphics.
So yeah, not a good example.
So what did programmers do at the time? And if you don't know, what would you suggest?
And no, please, I don't intend to split each picture into several parts unless you give me a good reason why I should.
Also, AI only lets you draw a single pixel at best (using the rectangle function).
They did fix that later with XGA, but that function is not backward compatible with the 8514.
I've been waiting for something like this for a long time. I even considered doing it myself. I do remember a while back that someone was asking about hooking a Raspberry Pi up to that small screen...
« Last post by Juju on March 12, 2018, 08:12:38 pm »
Hm, that's interesting. Sounds like something to TAS.
« Last post by JWinslow23 on March 12, 2018, 07:45:26 pm »
I was playing through Takeshi no Chousenjou recently, and I couldn't get far, but I wanted to see the ending. Apparently, according to an internet rumor, punching 30,720 times at the title screen warps you to the final room of the game, which sounds outlandish. However, looking through the code of the game, it's entirely true!
Therefore, I had to make a Game Genie code related to this. My results are EGOYESOK
(both attempts at making "readable" codes), which do the same thing: punch once at the title screen to warp to the final room of the game.
The ending, of course, is underwhelming.
« Last post by xlibman on March 12, 2018, 07:11:57 pm »
in Secret of Mana?
« Last post by Juju on March 11, 2018, 08:21:33 pm »
Oooh, nice, thanks. We'll put them up when we have some time for that.
« Last post by DarkestEx on March 11, 2018, 04:27:02 pm »
Could these Facebook/Twitter Like buttons be replaced with non-tracking ones like these: https://sharingbuttons.io ?
Or possibly have them toggleable with a slide switch?
I think others would appreciate this too.