This is my first question ever on StackOverflow, hurray! I can honestly say I use StackOverflow on a daily basis for both my work and personal programming mysteries. 99.9% of the time I actually find the answer I need on here too, which is great!
My current problem has actually stumped me a little, as I can't seem to find anything that works. I've already read several posts on GameDev.net and other resources around the net, but I can't sort it out.
I am in the process of porting a small 2D engine I wrote for XNA to SlimDX (just DirectX 9 at the moment), which has been a good move, as I learned more about the inner workings of DirectX in a few days than I did in six months of working with XNA. I have most of my basic rendering features done and actually managed to recreate the XNA SpriteBatch with a ton of additional features (which I really missed in XNA).
One of the last things I'm trying to get working is extracting a source rectangle from a given texture and using it for tiling. Why: when not tiling, you can simply adjust the UV coordinates to select the source region you want to display (e.g. 0.3;0.3 to 0.5;0.5), but when tiling, the UV coordinates have to span the tile count (0;0 to 2;2 means tile the image twice), and therefore you need the source region cut out into its own texture first.
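To illustrate the two cases, here are the UV coordinates I mean, sketched as what I would assign to a quad's four corners (Vector2 is from SlimDX; the `device` variable and the wrap addressing states are assumptions about how the sampler would be set up):

```csharp
// Case 1: no tiling - UVs select a sub-region of the texture directly.
Vector2[] sourceRegionUVs =
{
    new Vector2(0.3f, 0.3f), // top-left
    new Vector2(0.5f, 0.3f), // top-right
    new Vector2(0.3f, 0.5f), // bottom-left
    new Vector2(0.5f, 0.5f)  // bottom-right
};

// Case 2: tiling - UVs exceed 1 and rely on wrap addressing, which
// repeats the WHOLE texture, not a sub-region of it. Hence the need
// for a cut-out texture that contains only the source rectangle.
Vector2[] tilingUVs =
{
    new Vector2(0f, 0f),
    new Vector2(2f, 0f), // repeat twice horizontally
    new Vector2(0f, 2f), // repeat twice vertically
    new Vector2(2f, 2f)
};

device.SetSamplerState(0, SamplerState.AddressU, TextureAddress.Wrap);
device.SetSamplerState(0, SamplerState.AddressV, TextureAddress.Wrap);
```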
To make a long story short, I'm trying to use the following:
DataRectangle dataRectangle = sprite.Texture.LockRectangle(0, LockFlags.None);
Format format = sprite.Texture.GetLevelDescription(0).Format;
byte[] buffer = new byte[4]; // one 32-bit pixel
dataRectangle.Data.Read(buffer, (y * dataRectangle.Pitch) + (x * 4), buffer.Length); // x, y: the pixel coordinates I want to sample
texture.UnlockRectangle(0);
I tried different pixels, but they all seem to give bogus data. For example, I actually used my current avatar to check whether the buffer I got from the DataRectangle matches the actual pixel in the image, but no luck (I even checked whether the Format was correct, which it is).
What am I doing wrong? Is there a better way to do it? Or is my UV story wrong, and can this be solved much more simply than by cutting out a source rectangle before tiling it?
Thank you for your time,
Lennard Fonteijn
Update #1
I actually managed to export the pixel data to a Bitmap using the following conversion from a byte array:
int pixel = (buffer[0] & 0xFF) | ((buffer[1] & 0xFF) << 8) | ((buffer[2] & 0xFF) << 16) | ((255 - buffer[3] & 0xFF) << 24);
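If I'm reading my own conversion right, this assumes the locked data is laid out as A8R8G8B8, i.e. bytes in memory order B, G, R, A, with the alpha inverted on the way out. A worked example with made-up byte values (the buffer contents here are hypothetical):

```csharp
// Hypothetical pixel bytes as read from the DataRectangle (B, G, R, A order).
byte[] buffer = { 0x10, 0x20, 0x30, 0x40 };

// Subtraction binds tighter than &, so the last term is (255 - buffer[3]) & 0xFF.
int pixel = (buffer[0] & 0xFF)
          | ((buffer[1] & 0xFF) << 8)
          | ((buffer[2] & 0xFF) << 16)
          | ((255 - buffer[3] & 0xFF) << 24);

// 0x10 | 0x2000 | 0x300000 | (0xBF << 24)
Console.WriteLine(pixel.ToString("X8")); // prints "BF302010"
```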
So the data isn't as bogus as I thought it was. My next problem, however, is grabbing the pixels specified in the source rectangle and copying them to a new texture. The image I'm trying to cut from is 150x150, but for some reason the surface is stretched to 256x256 (a power of two); yet when I actually try to access pixels beyond 150x150, it throws an OutOfBounds exception. Also, when I try to create a second blank 256x256 texture, no matter what I put into it, it turns out completely black.
Here's my current code:
// Texture texture = 150x150
DataRectangle dataRectangle = texture.LockRectangle(0, LockFlags.None);
SurfaceDescription surface = texture.GetLevelDescription(0);

Texture texture2 = new Texture(_graphicsDevice, surface.Width, surface.Height, 0, surface.Usage, surface.Format, surface.Pool);
DataRectangle dataRectangle2 = texture2.LockRectangle(0, LockFlags.None);

for (int k = sourceX; k < sourceHeight; k++)
{
    for (int l = sourceY; l < sourceWidth; l++)
    {
        byte[] buffer = new byte[4];

        dataRectangle.Data.Seek((k * dataRectangle.Pitch) + (l * 4), SeekOrigin.Begin);
        dataRectangle.Data.Read(buffer, 0, 4);

        dataRectangle2.Data.Seek(((k - sourceY) * dataRectangle2.Pitch) + ((l - sourceX) * 4), SeekOrigin.Begin);
        dataRectangle2.Data.Write(buffer, 0, 4);
    }
}

texture.UnlockRectangle(0);
texture2.UnlockRectangle(0);

_graphicsDevice.SetTexture(0, texture2);
So my new (additional) questions are: how can I move pixels from one texture to another, smaller texture, including the alpha channel? And why does the SurfaceDescription report 256x256 when my original texture is 150x150?