How to use the Hadoop MapReduce framework for an OpenCL application?

Posted 2019-08-17 17:27

I am developing an application in OpenCL whose basic objective is to implement a data mining algorithm on a GPU platform. I want to use the Hadoop Distributed File System and execute the application on multiple nodes. I am using the MapReduce framework and have divided my basic algorithm into two parts, 'Map' and 'Reduce'.

I have never worked with Hadoop before, so I have some questions:

  1. Do I have to write my application in Java only, in order to use Hadoop and the MapReduce framework?
  2. I have written kernel functions for map and reduce in OpenCL. Is it possible to use HDFS as the file system for a non-Java GPU-computing application? (Note: I don't want to use JavaCL or Aparapi.)

Answer 1:

You can use Hadoop Streaming. With it you can write your mapper and reducer in any language you want, as long as your code can read from standard input and write back to standard output. For inspiration, you can look at examples of how R is used with Hadoop Streaming.
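To make the streaming contract concrete, here is a minimal word-count sketch in Python (the classic streaming example, not the asker's data mining algorithm; in their case the per-record work inside `mapper`/`reducer` would call into the OpenCL host code instead):

```python
import sys

def mapper(stream):
    # Streaming mapper: read raw lines from stdin, emit one
    # tab-separated "key\tvalue" pair per word to stdout.
    for line in stream:
        for word in line.split():
            print(f"{word}\t1")

def reducer(stream):
    # Streaming reducer: Hadoop sorts mapper output by key before
    # the reduce phase, so equal keys arrive as a contiguous run.
    current, count = None, 0
    for line in stream:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__" and len(sys.argv) > 1:
    # Invoked as e.g. `python wordcount.py map` or `python wordcount.py reduce`.
    mapper(sys.stdin) if sys.argv[1] == "map" else reducer(sys.stdin)
```

The only interface Hadoop Streaming imposes is this line-oriented stdin/stdout protocol, which is why the mapper and reducer can be any executable, including one written in C/C++ that launches OpenCL kernels.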



Answer 2:

HDFS is a file system; you can use the HDFS file system with any language.

HDFS data is distributed across multiple machines, and it is highly available to the data processing in your GPU computing.

For more information, refer to Hadoop Streaming.
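Putting the two answers together, submitting a job with non-Java (e.g. compiled OpenCL host) executables could look roughly like this. This is a command-line sketch only: the HDFS paths, the `ocl_map`/`ocl_reduce` binary names, and the streaming jar location are placeholders, not details from the original post, and it needs a running cluster.

```shell
# Stage the input data into HDFS (paths are illustrative).
hdfs dfs -mkdir -p /user/me/input
hdfs dfs -put local_data.txt /user/me/input/

# Launch the streaming job; -files ships the compiled mapper and
# reducer binaries to every worker node.
hadoop jar "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-streaming-*.jar \
    -files ./ocl_map,./ocl_reduce \
    -input /user/me/input \
    -output /user/me/output \
    -mapper ocl_map \
    -reducer ocl_reduce
```

Hadoop handles splitting the HDFS input, piping records through the binaries, and the sort/shuffle between the two phases; the binaries themselves never need to speak to HDFS directly.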



Source: How to use hadoop MapReduce framework for an OpenCL application?