Test plan:
First, create the network data-source sender (program one).
Next, create the Spark program that receives the data (program two).
Then, package program one and run it on a server. It takes three arguments: the data file to send, the port to send on, and the send interval in milliseconds.
Finally, run the Spark program, which processes the data in 5-second batches. It takes two arguments: the hostname of the sender and the port it listens on.
Observe the results.
Program one:

package sparkStreaming

import java.io.PrintWriter
import java.net.ServerSocket
import scala.io.Source

object SalaSimulation {
  // Pick a random line index in [0, length).
  def index(length: Int) = {
    import java.util.Random
    val rdm = new Random
    rdm.nextInt(length)
  }

  def main(args: Array[String]) {
    if (args.length != 3) {
      System.err.println("Usage: SalaSimulation <filename> <port> <millisecond>")
      System.exit(1)
    }

    // Load the whole data file into memory.
    val filename = args(0)
    val lines = Source.fromFile(filename).getLines.toList
    val filerow = lines.length

    // Serve each client on its own thread, pushing one random line per interval.
    val listener = new ServerSocket(args(1).toInt)
    while (true) {
      val socket = listener.accept()
      new Thread() {
        override def run = {
          println("Got client connected from: " + socket.getInetAddress)
          val out = new PrintWriter(socket.getOutputStream())
          while (true) {
            Thread.sleep(args(2).toLong)
            val content = lines(index(filerow))
            println(content)
            out.write(content + '\n')
            out.flush()
          }
          socket.close()
        }
      }.start()
    }
  }
}
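Program one's send/receive handshake can be exercised without Spark at all. The sketch below (the object name RoundTripSketch and the sample line are mine, not from the post) binds a ServerSocket to an ephemeral port, pushes one line from a background thread the way the simulator does, and reads it back with a plain client socket standing in for Spark's socket receiver.

```scala
import java.io.{BufferedReader, InputStreamReader, PrintWriter}
import java.net.{ServerSocket, Socket}

object RoundTripSketch {
  def main(args: Array[String]): Unit = {
    val listener = new ServerSocket(0)   // port 0: let the OS pick a free port
    val port = listener.getLocalPort

    // Server side: accept one client and push a single line, like the simulator.
    new Thread() {
      override def run(): Unit = {
        val socket = listener.accept()
        val out = new PrintWriter(socket.getOutputStream())
        out.write("hello spark\n")
        out.flush()
        socket.close()
      }
    }.start()

    // Client side: connect and read the line back.
    val client = new Socket("localhost", port)
    val in = new BufferedReader(new InputStreamReader(client.getInputStream))
    println(in.readLine())               // prints: hello spark
    client.close()
    listener.close()
  }
}
```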
Program two:

package sparkStreaming

import org.apache.log4j.{Logger, Level}
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.streaming.StreamingContext._

object NetworkWordCount {
  def main(args: Array[String]) {
    // Quiet the noisy Spark and Jetty loggers.
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.apache.eclipse.jetty.server").setLevel(Level.OFF)

    // The app name, master URL and storage level were lost in the original post;
    // "NetworkWordCount", local[2] and MEMORY_ONLY are reasonable stand-ins.
    val conf = new SparkConf().setAppName("NetworkWordCount").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val ssc = new StreamingContext(sc, Seconds(5))   // one batch every 5 seconds

    // args(0) = hostname of the sender, args(1) = port it listens on.
    val lines = ssc.socketTextStream(args(0), args(1).toInt, StorageLevel.MEMORY_ONLY)
    val words = lines.flatMap(_.split(" "))
    val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
    wordCounts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
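The transformation chain in program two can be sketched on plain Scala collections, so the counting logic is testable without a cluster. reduceByKey has no direct collection equivalent, so it is approximated below with groupBy plus a sum; the sample input is mine.

```scala
object WordCountSketch {
  def main(args: Array[String]): Unit = {
    // Same flatMap -> map -> reduce-by-key pipeline as the DStream version.
    val lines = List("a b a", "b c")
    val wordCounts = lines
      .flatMap(_.split(" "))                 // split lines into words
      .map(x => (x, 1))                      // pair each word with a count of 1
      .groupBy(_._1)                         // reduceByKey, collection-style
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) }
    wordCounts.toList.sorted.foreach(println)
  }
}
```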