I'm trying to build a Mac broadcast client that encodes to H.264 using FFmpeg, but without the x264 library.
Basically, I'm able to get raw frames out of AVFoundation as either CMSampleBufferRef or AVPicture. Is there a way to encode a series of those frames into H.264 using an Apple framework, such as AVVideoCodecH264?
I know I can encode using AVAssetWriter, but that only saves the video to a file. I don't want a file; instead, I want AVPackets so I can send them out with FFmpeg. Does anyone have any idea? Thank you.
After referring to the VideoCore project, I was able to use Apple's VideoToolbox framework to encode in hardware:

1. Start a VTCompressionSession.
2. Push the raw frames to the VTCompressionSession.
3. Get the encoded frames in the VT callback, a C function passed as a parameter to VTCompressionSessionCreate().
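The steps above can be sketched roughly as follows. This is a minimal, untested outline, not a complete implementation: the resolution, the property choices, and the `handleEncodedFrame()` hook (where you would repackage the NAL units into an AVPacket for FFmpeg) are my assumptions, not part of the original answer.

```c
// Minimal VideoToolbox H.264 encoding sketch (macOS).
// Assumptions: pixel buffers come from AVFoundation capture;
// handleEncodedFrame() is a hypothetical function where you would
// wrap the NAL data into an AVPacket and hand it to FFmpeg.
#include <VideoToolbox/VideoToolbox.h>

// Step 3: the C output callback passed to VTCompressionSessionCreate().
static void didCompress(void *refcon, void *sourceFrameRefCon,
                        OSStatus status, VTEncodeInfoFlags infoFlags,
                        CMSampleBufferRef sampleBuffer) {
    if (status != noErr || sampleBuffer == NULL) return;
    // Encoded H.264 arrives in AVCC layout (length-prefixed NAL units).
    CMBlockBufferRef block = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t length = 0;
    char *data = NULL;
    if (CMBlockBufferGetDataPointer(block, 0, NULL, &length, &data)
            == kCMBlockBufferNoErr) {
        // handleEncodedFrame(data, length);  // hypothetical FFmpeg hand-off
    }
}

// Step 1: create and configure the compression session.
VTCompressionSessionRef startSession(int32_t width, int32_t height) {
    VTCompressionSessionRef session = NULL;
    OSStatus status = VTCompressionSessionCreate(
        kCFAllocatorDefault, width, height, kCMVideoCodecType_H264,
        NULL, NULL, NULL, didCompress, NULL, &session);
    if (status != noErr) return NULL;
    // Real-time mode is a reasonable choice for a broadcast client.
    VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime,
                         kCFBooleanTrue);
    VTCompressionSessionPrepareToEncodeFrames(session);
    return session;
}

// Step 2: push one raw frame (e.g. the CVPixelBufferRef extracted
// from a captured CMSampleBufferRef) into the session.
void encodeFrame(VTCompressionSessionRef session,
                 CVPixelBufferRef pixelBuffer,
                 CMTime pts, CMTime duration) {
    VTCompressionSessionEncodeFrame(session, pixelBuffer, pts, duration,
                                    NULL, NULL, NULL);
}
```

Note that to send the frames over the network you will typically also need the SPS/PPS parameter sets, which can be read from the sample buffer's format description rather than from the block buffer itself.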