Draggable UIImageView, Partially Transparent & Irregularly Shaped

Posted 2019-04-02 17:33

@interface UIDraggableImageView : UIImageView {
    CGPoint startLocation; // needed by the touch handlers in the .m file
}
@end

.m file:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Retrieve the touch point
    CGPoint point = [[touches anyObject] locationInView:self];
    startLocation = point;
    [[self superview] bringSubviewToFront:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Move relative to the original touch point
    CGPoint point = [[touches anyObject] locationInView:self];
    CGRect frame = [self frame];
    frame.origin.x += point.x - startLocation.x;
    frame.origin.y += point.y - startLocation.y;
    [self setFrame:frame];
}
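The drag logic above reduces to simple delta arithmetic: the frame origin is shifted by the difference between the current touch location and the location recorded in touchesBegan:. A minimal C sketch of that math (the `Point`/`Rect` stand-in types and the `dragFrame` name are assumptions of this sketch, used so the arithmetic runs outside Apple platforms; the real code uses CGPoint/CGRect):

```c
// Stand-ins for CGPoint / CGRect so the math runs anywhere
// (an assumption of this sketch; the real code uses CoreGraphics types).
typedef struct { double x, y; } Point;
typedef struct { Point origin; double width, height; } Rect;

// Shift the frame by the delta between the current touch location and the
// location captured in touchesBegan: (both in the view's own coordinates).
Rect dragFrame(Rect frame, Point start, Point current) {
    frame.origin.x += current.x - start.x;
    frame.origin.y += current.y - start.y;
    return frame;
}
```

For example, a frame at (100, 100) dragged from touch point (10, 10) to (25, 18) moves to (115, 108).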

I took this code off the web for the draggable image.

Problem: the image is an irregular shape with transparent areas, and tapping on a transparent area drags it as well.

Required solution: how do I make the transparent areas non-interactive/non-draggable?

Any suggestions? I will try masking the image as an attempt and will post the results, but any workarounds/advice are welcome.

Further to the suggestions by MiRAGe: I am trying to incorporate the code into one class file, since the image property is already available on UIImageView and that would make it easy to plug and play with any UIImageView in Interface Builder. But I am still having problems: the transparent areas are movable, and the hitTest method gets called several times on a single tap. Any advice?

#import "UIImageViewDraggable.h"

@implementation UIImageViewDraggable

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Retrieve the touch point
    CGPoint point = [[touches anyObject] locationInView:self];
    startLocation = point;
    [[self superview] bringSubviewToFront:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Move relative to the original touch point
    CGPoint point = [[touches anyObject] locationInView:self];
    CGRect frame = [self frame];
    frame.origin.x += point.x - startLocation.x;
    frame.origin.y += point.y - startLocation.y;
    [self setFrame:frame];
}

- (NSData *)alphaData {
    CGContextRef cgctx = NULL;
    void *bitmapData;

    size_t pixelsWide = CGImageGetWidth(self.image.CGImage);
    size_t pixelsHigh = CGImageGetHeight(self.image.CGImage);

    // One byte per pixel for an alpha-only bitmap
    size_t bitmapByteCount = pixelsWide * pixelsHigh;

    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL)
        return nil;

    cgctx = CGBitmapContextCreate(bitmapData,
                                  pixelsWide,
                                  pixelsHigh,
                                  8,
                                  pixelsWide,
                                  NULL,
                                  kCGImageAlphaOnly);
    if (cgctx == NULL) {
        free(bitmapData);
        fprintf(stderr, "Context not created!");
        return nil;
    }

    CGRect rect = CGRectMake(0, 0, pixelsWide, pixelsHigh);
    CGContextDrawImage(cgctx, rect, self.image.CGImage);

    unsigned char *data = CGBitmapContextGetData(cgctx);
    CGContextRelease(cgctx);

    if (!data) {
        free(bitmapData);
        return nil;
    }

    NSData *alphaData = [NSData dataWithBytes:data length:bitmapByteCount];

    free(bitmapData);
    return alphaData;
}

- (BOOL)isTransparentLocation:(CGPoint)point withData:(NSData *)data {
    if (data == nil)
        NSLog(@"data was nil");

    // Index with the bitmap's pixel width, not [self.image size].width:
    // size is in points and differs from the pixel width on retina images.
    NSUInteger index = point.x + (point.y * CGImageGetWidth(self.image.CGImage));
    unsigned char *rawDataBytes = (unsigned char *)[data bytes];

    return (rawDataBytes[index] == 0);
}

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    NSLog(@"test");
    NSAutoreleasePool *pool = [NSAutoreleasePool new];

    // view responding to the hit test. note that self may respond too.
    UIView *anyViewResponding = [super hitTest:point withEvent:event];
    if (anyViewResponding == nil || anyViewResponding == self) {
        // convert the point in the image, to a global point.
        CGPoint framePoint = [self.superview convertPoint:point fromView:self];
        // if the point is in the image frame, and there is an image, see if we need to let the touch through or not
        if (self.image != nil && CGRectContainsPoint([self frame], framePoint)) {
            // NOTE: hitTest: can legitimately fire several times per tap, and
            // rebuilding the alpha buffer each time is expensive; consider
            // caching this NSData and invalidating it when the image changes.
            NSData *imageData = [self alphaData];

            // check if the point touched is transparent in the image
            if (imageData != nil && [self isTransparentLocation:point withData:imageData]) {
                // return nil, so the touch will not arrive at this view
                anyViewResponding = nil;
            }
        }
    }

    [pool drain];
    return anyViewResponding;
}
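Part of the trouble here is likely coordinate mismatch: the alpha buffer is built at the CGImage's pixel dimensions, while the touch point is in view points, which differ by the image scale factor on retina devices; bitmaps may also pad each row to a stride (bytesPerRow) wider than the pixel width. A hedged C sketch of the index math (the `alphaIndex` name and parameters are mine, not UIKit API):

```c
#include <stddef.h>

// Map a touch point given in view points to an index into an alpha-only
// bitmap. scale is the image scale factor (e.g. 2.0 on retina), bytesPerRow
// is the stride used when the bitmap context was created. Returns
// (size_t)-1 when the point falls outside the bitmap.
size_t alphaIndex(double px, double py, double scale,
                  size_t pixelsWide, size_t pixelsHigh, size_t bytesPerRow) {
    size_t x = (size_t)(px * scale);
    size_t y = (size_t)(py * scale);
    if (x >= pixelsWide || y >= pixelsHigh)
        return (size_t)-1; // out of bounds: treat as transparent / no hit
    return y * bytesPerRow + x; // NOT y * pixelsWide when rows are padded
}
```

For a 100x100-pixel bitmap with a 100-byte stride and a scale of 2.0, the view point (10, 5) maps to pixel (20, 10), i.e. index 1020.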

2 Answers

在下西门庆 · 2019-04-02 17:38

You can perform hit detection to determine if the CGPoint that represents the tap gesture lies within a shape that is defined by a CGPath.

- (id)initWithFrame:(CGRect)frame {
    ...
    // outline is a CGMutablePathRef ivar, so the touch handlers can reach it
    outline = CGPathCreateMutable();
    CGPathMoveToPoint(outline, NULL, 20, 20);
    // Build up path
    ...
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    if (CGPathContainsPoint(outline, NULL, point, false)) {
        ...
        dragIsRespected = YES;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (dragIsRespected) {
        ...
    }
}

- (void)dealloc {
    CGPathRelease(outline);
    ...
}

The bad thing is that building up the path for complex shapes is tedious. The good thing is that since you're dealing with finger taps, coarse outlines should probably suffice. Plus, this allows your touch target to diverge from the opaque parts of the image in case you need some additional space to help with usability. It also allows for transparency within the touch target, letting you use more complex images if necessary.

You can get the desired boundary points pretty easily if you just map them out in an image editor.
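For readers without CoreGraphics at hand, the containment test that CGPathContainsPoint performs on a straight-line outline can be approximated with a standard even-odd ray-crossing test. This is a simplified sketch of that idea only (CGPath also handles curves and winding rules); the `Pt` type and `pointInPolygon` name are my own:

```c
#include <stdbool.h>

typedef struct { double x, y; } Pt;

// Even-odd rule: cast a horizontal ray from p and count how many polygon
// edges it crosses. An odd count means p lies inside the outline pts[0..n-1].
bool pointInPolygon(Pt p, const Pt *pts, int n) {
    bool inside = false;
    for (int i = 0, j = n - 1; i < n; j = i++) {
        bool crosses = (pts[i].y > p.y) != (pts[j].y > p.y);
        if (crosses) {
            // x coordinate where the edge (j -> i) meets the ray's y level
            double xAtY = pts[j].x + (p.y - pts[j].y) *
                          (pts[i].x - pts[j].x) / (pts[i].y - pts[j].y);
            if (p.x < xAtY)
                inside = !inside;
        }
    }
    return inside;
}
```

With a triangle (0,0), (10,0), (5,10), the point (5,3) tests inside and (20,3) tests outside, which matches what a CGPath built from the same vertices would report.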

forever°为你锁心 · 2019-04-02 17:53

You can easily detect alpha areas and make them non-draggable. Here is some code that will let you detect alpha areas. It may add some overhead for you, but it's the best I could do.

I have subclassed UIImage and put this code in the implementation file.

#import <CoreGraphics/CoreGraphics.h>

- (NSData *)alphaData
{
    CGContextRef    cgctx = NULL;
    void *          bitmapData;
    int             bitmapByteCount;

    size_t pixelsWide = CGImageGetWidth(self.CGImage);
    size_t pixelsHigh = CGImageGetHeight(self.CGImage);

    bitmapByteCount     = (pixelsWide * pixelsHigh);

    bitmapData = malloc( bitmapByteCount );
    if (bitmapData == NULL) 
        return nil;

    cgctx = CGBitmapContextCreate (bitmapData,
                                   pixelsWide,
                                   pixelsHigh,
                                   8,
                                   pixelsWide,
                                   NULL,
                                   kCGImageAlphaOnly);
    if (cgctx == NULL)
    {
        free (bitmapData);
        fprintf (stderr, "Context not created!");

        return nil;
    }

    CGRect rect = {{0,0},{pixelsWide,pixelsHigh}}; 
    CGContextDrawImage(cgctx, rect, self.CGImage); 

    unsigned char *data = CGBitmapContextGetData(cgctx);

    CGContextRelease(cgctx);

    if (!data)
    {
        free(bitmapData);
        return nil;
    }

    size_t dataSize = pixelsWide * pixelsHigh;

    NSData *alphaData = [NSData dataWithBytes:data length:dataSize];

    free(bitmapData);
    return alphaData;
}    

- (BOOL)isTransparentLocation:(CGPoint)point withData:(NSData *)data
{   
    if (data == nil)
        NSLog(@"data was nil");

    // Use the bitmap's pixel width so the index matches the alpha buffer;
    // [self size].width is in points and differs on retina images.
    NSUInteger index = point.x + (point.y * CGImageGetWidth(self.CGImage));
    unsigned char *rawDataBytes = (unsigned char *)[data bytes];

    return (rawDataBytes[index] == 0);
}

Now in a subclass of UIImageView (I use the hitTest function to allow detection, but you could easily change this into something that works for you, this is just an example) I put this code to detect if the point hit was transparent or not. If it is transparent, we pass the touch onto the view below, otherwise we keep the touch to ourselves.

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSAutoreleasePool *pool = [NSAutoreleasePool new];

    // view responding to the hit test. note that self may respond too.
    UIView *anyViewResponding = [super hitTest:point withEvent:event];  
    if( anyViewResponding == nil || anyViewResponding == self )
    {
        // convert the point in the image, to a global point.
        CGPoint framePoint = [self.superview convertPoint:point fromView:self];
        // if the point is in the image frame, and there is an image, see if we need to let the touch through or not
        if( self.image != nil && CGRectContainsPoint([self frame], framePoint) )
        {
            NSData *imageData = [self.image alphaData];         

            // check if the point touched is transparent in the image
            if( imageData != nil && [self.image isTransparentLocation:point withData:imageData] )
            {               
                // return nil, so the touch will not arrive at this view
                anyViewResponding = nil;
            }
        }
    }

    [pool drain];
    return anyViewResponding;
}
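One caveat with the hitTest: above: point arrives in the view's coordinate space, while isTransparentLocation:withData: indexes image pixels, so the two only line up when the view's size equals the image's pixel size. When the image is stretched to fill the view, the point has to be mapped first. A speculative C sketch of that mapping (scale-to-fill content mode assumed; the function and parameter names are mine, not UIKit API):

```c
#include <stddef.h>

// Map a point in view coordinates to pixel coordinates in an image that is
// stretched to fill the view (UIViewContentModeScaleToFill behaviour).
void viewPointToPixel(double vx, double vy,
                      double viewW, double viewH,
                      size_t pixelsWide, size_t pixelsHigh,
                      size_t *outX, size_t *outY) {
    *outX = (size_t)(vx / viewW * (double)pixelsWide);
    *outY = (size_t)(vy / viewH * (double)pixelsHigh);
}
```

For a 100x100-point view showing a 200x300-pixel image, the view point (50, 50) maps to pixel (100, 150); other content modes (aspect fit/fill) would need the letterboxing offsets accounted for as well.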