I wanted to create a simple iOS plugin that can draw a texture into a Unity Texture2D. I've already done it with CreateExternalTexture() and UpdateExternalTexture(), and that works fine, but I'm curious whether I can fill the Unity texture directly from the iOS side. Here's the code of my iOS plugin:
//
// testTexturePlugin.m
// Unity-iPhone
//
// Created by user on 18/01/16.
//
//

#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <UIKit/UIKit.h>
#include "UnityMetalSupport.h"

#include <stdlib.h>
#include <stdint.h>

static UIImage* LoadImage()
{
    NSString* imageName = @"logo"; //[NSString stringWithUTF8String: filename];
    NSString* imagePath = [[NSBundle mainBundle] pathForResource: imageName ofType: @"png"];
    return [UIImage imageWithContentsOfFile: imagePath];
}

// you need to free this pointer
static void* LoadDataFromImage(UIImage* image)
{
    CGImageRef imageData = image.CGImage;
    unsigned imageW = CGImageGetWidth(imageData);
    unsigned imageH = CGImageGetHeight(imageData);

    // for the sake of the sample we enforce 128x128 textures
    //assert(imageW == 128 && imageH == 128);

    void* textureData = ::malloc(imageW * imageH * 4);
    ::memset(textureData, 0x00, imageW * imageH * 4);

    CGContextRef textureContext = CGBitmapContextCreate(textureData, imageW, imageH, 8, imageW * 4, CGImageGetColorSpace(imageData), kCGImageAlphaPremultipliedLast);
    CGContextSetBlendMode(textureContext, kCGBlendModeCopy);
    CGContextDrawImage(textureContext, CGRectMake(0, 0, imageW, imageH), imageData);
    CGContextRelease(textureContext);

    return textureData;
}

static void CreateMetalTexture(uintptr_t texRef, void* data, unsigned w, unsigned h)
{
#if defined(__IPHONE_8_0) && !TARGET_IPHONE_SIMULATOR
    NSLog(@"texRef iOS = %lu", texRef);
    id<MTLTexture> tex = (id<MTLTexture>)(size_t)texRef;

    MTLRegion r = MTLRegionMake3D(0, 0, 0, w, h, 1);
    [tex replaceRegion: r mipmapLevel: 0 withBytes: data bytesPerRow: w * 4];
#else
#endif
}

extern "C" void FillUnityTexture(uintptr_t texRef)
{
    UIImage* image = LoadImage();
    void* textureData = LoadDataFromImage(image);

    if (UnitySelectedRenderingAPI() == apiMetal)
        CreateMetalTexture(texRef, textureData, image.size.width, image.size.height);

    ::free(textureData);
}
And here's the Unity code:
using UnityEngine;
using System;
using System.Collections;
using System.Runtime.InteropServices;

public class TextureHandler : MonoBehaviour {

    [SerializeField]
    private Renderer _mesh;

    private Texture2D _meshTexture;

    [DllImport("__Internal")]
    private static extern void FillUnityTexture(IntPtr texRef);

    void Start () {
        _meshTexture = new Texture2D(200, 200, TextureFormat.ARGB32, false);
        _mesh.material.SetTextureScale ("_MainTex", new Vector2 (-1, -1));
        _mesh.material.mainTexture = _meshTexture;

        IntPtr texPtr = _meshTexture.GetNativeTexturePtr();
        Debug.Log("texPtr Unity = " + texPtr);

        FillUnityTexture(texPtr);
    }
}
The pointer to the Unity texture is passed to the iOS plugin correctly, I checked. But I get a crash on this line in the iOS plugin:
[tex replaceRegion: r mipmapLevel: 0 withBytes: data bytesPerRow: w * 4];
and I'm pretty sure the problem is the incorrect conversion of the Unity texture pointer (uintptr_t) to a Metal texture (id<MTLTexture>).
So my question is: how can I properly convert the texture pointer to an MTLTexture?
I guess you should read about ARC. You can use __bridge_retained to transfer the ownership of a newly created id<MTLTexture> object to the code that holds it as a uintptr_t. When you want to convert the uintptr_t back to an id<MTLTexture>, use __bridge if you don't want to transfer the ownership back, or use __bridge_transfer when you do.
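To make this concrete, here is a minimal sketch of the conversion applied to your CreateMetalTexture, assuming the plugin file is compiled as Objective-C++ with ARC enabled. Unity keeps ownership of the texture it handed over, so a plain __bridge cast (no ownership transfer) should be enough:

static void CreateMetalTexture(uintptr_t texRef, void* data, unsigned w, unsigned h)
{
    // Unity still owns the texture behind texRef, so cast without
    // transferring ownership to ARC.
    id<MTLTexture> tex = (__bridge id<MTLTexture>)(void*)texRef;

    MTLRegion r = MTLRegionMake3D(0, 0, 0, w, h, 1);
    [tex replaceRegion: r mipmapLevel: 0 withBytes: data bytesPerRow: w * 4];
}

And for the opposite direction, where the plugin creates the texture itself and hands out a raw pointer (newTex here is just a hypothetical texture created by the plugin):

// __bridge_retained retains newTex so ARC won't release it while only
// the raw pointer is held elsewhere.
uintptr_t texRef = (uintptr_t)(__bridge_retained void*)newTex;

// Later, __bridge_transfer hands ownership back to ARC so the texture
// can be released normally.
id<MTLTexture> tex = (__bridge_transfer id<MTLTexture>)(void*)texRef;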