C array memory deallocation from Swift

Posted 2019-07-03 11:02

Question:

var cpuInfo: processor_info_array_t = nil
var numCpuInfo: mach_msg_type_number_t = 0
var coresTotalUsage: Float = 0.0
var numCPUsU: natural_t = 0
let err = host_processor_info(mach_host_self(), PROCESSOR_CPU_LOAD_INFO, &numCPUsU, &cpuInfo, &numCpuInfo);
assert(err == KERN_SUCCESS, "Failed call to host_processor_info")

Hi, I am calling the above C API host_processor_info to get processor load information from Swift, no problem there. cpuInfo is an inout parameter (pointer) that, on return, points to a structure containing the CPU information allocated by that API. The caller is responsible for deallocating that memory; I can do this easily from Objective-C but haven't had any luck in Swift. I know I could wrap the call in an Objective-C extension, but I'm trying to learn Swift and would like to avoid the Objective-C solution if possible.

In Objective-C I would deallocate with:

size_t cpuInfoSize = sizeof(integer_t) * numCpuInfo;
vm_deallocate(mach_task_self(), (vm_address_t) cpuInfo, cpuInfoSize);

cpuInfo in Swift is an UnsafeMutablePointer, which is not convertible to a vm_address_t.

Any help appreciated, thanks.

Answer 1:

processor_info_array_t is a pointer type, and vm_address_t is an integer type (ultimately an alias for UInt). (Judging from the comments in <i386/vm_types.h>, this might be for historical reasons.) The only way to convert a pointer to an integer (of the same size) in Swift is unsafeBitCast.

mach_init.h defines

extern mach_port_t      mach_task_self_;
#define mach_task_self() mach_task_self_

Only the extern variable is visible in Swift, not the macro.

This gives:

let cpuInfoSize = vm_size_t(sizeof(integer_t)) * vm_size_t(numCpuInfo)
vm_deallocate(mach_task_self_, unsafeBitCast(cpuInfo, vm_address_t.self), cpuInfoSize)


Answer 2:

In Swift 4, the equivalent code appears to be:

let cpuInfoSize = vm_size_t(MemoryLayout<integer_t>.stride * Int(numCpuInfo))
vm_deallocate(mach_task_self_, vm_address_t(bitPattern: cpuInfo), cpuInfoSize)

In particular, the initializer UInt(bitPattern:) is now preferred over unsafeBitCast() for initializing an unsigned integer with the bit pattern of a pointer (presumably this usage is no longer considered "unsafe"). It correctly handles a nil pointer, returning 0 in that case.
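
Putting the two answers together, a minimal sketch in current Swift (5.x on macOS) could look like the following; the function name reportCPUTicks and the tick-summing loop are illustrative additions, not part of the original question:

import Darwin

func reportCPUTicks() {
    var cpuInfo: processor_info_array_t?
    var numCpuInfo: mach_msg_type_number_t = 0
    var numCPUsU: natural_t = 0

    // The kernel allocates the info buffer and hands ownership to the caller.
    let err = host_processor_info(mach_host_self(),
                                  PROCESSOR_CPU_LOAD_INFO,
                                  &numCPUsU,
                                  &cpuInfo,
                                  &numCpuInfo)
    guard err == KERN_SUCCESS, let cpuInfo = cpuInfo else {
        print("host_processor_info failed: \(err)")
        return
    }

    // Illustrative use of the data: sum the user-mode ticks across all cores.
    var userTicks = 0
    for cpu in 0..<Int(numCPUsU) {
        userTicks += Int(cpuInfo[cpu * Int(CPU_STATE_MAX) + Int(CPU_STATE_USER)])
    }
    print("Total user ticks: \(userTicks)")

    // Return the buffer to the kernel, as in Answer 2.
    let cpuInfoSize = vm_size_t(MemoryLayout<integer_t>.stride * Int(numCpuInfo))
    vm_deallocate(mach_task_self_, vm_address_t(bitPattern: cpuInfo), cpuInfoSize)
}

The guard ensures vm_deallocate is only reached when the call succeeded and the pointer is non-nil, so nothing is freed on the error path and nothing leaks on the success path.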