
许鹏: Source Code Reading in Practice, with Spark as an Example

Source: 程序员人生 | Published: 2014-09-16 08:58:42 | Reads: 2990

【編者按】在 對許鵬的采訪中,我們有從方法上進(jìn)行了大型開源項(xiàng)目的學(xué)習(xí),其中包括Problem domain→model→architecture&implementation→improvement→best practice的思維范式,而本次許鵬的博文則更關(guān)注源碼跟讀過程中消息或調(diào)用的流程追蹤。


免費(fèi)訂閱“CSDN云計(jì)算”微信公眾號,實(shí)時掌握第一手云中消息!

CSDN作為國內(nèi)最專業(yè)的云計(jì)算服務(wù)平臺,提供云計(jì)算、大數(shù)據(jù)、虛擬化、數(shù)據(jù)中心、OpenStack、CloudStack、Hadoop、Spark、機(jī)器學(xué)習(xí)、智能算法等相關(guān)云計(jì)算觀點(diǎn),云計(jì)算技術(shù),云計(jì)算平臺,云計(jì)算實(shí)踐,云計(jì)算產(chǎn)業(yè)資訊等服務(wù)。


The original article follows.

Overview

This post is not about any complex implementation technique inside Spark; it is a short note on how to read its code. As everyone knows, Spark is written in Scala, and with Scala's many syntactic sugars it is easy to lose the thread while following the code. Spark also uses Akka for message passing, so how do you find out who the receiver of a message is?

new Throwable().printStackTrace

When reading code we often lean on the logs: for every log line we would like to know who the caller is. But with limited knowledge of Spark's internals, or of Scala, the answer may not be obvious right away. Is there a shortcut? My trick is to insert the following line at the spot where the log statement appears:

new Throwable().printStackTrace()
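The trick can be demonstrated outside Spark. Below is a minimal, Spark-free sketch (the object and method names are hypothetical, chosen only to mirror the MemoryStore example): constructing a Throwable without throwing it captures the current call stack, which you can print or inspect.

```scala
object TraceDemo {
  // Constructing a Throwable captures the current stack without throwing it.
  // Frame 0 is tryToPut itself; frame 1 is whoever called it.
  def tryToPut(): String =
    new Throwable().getStackTrace()(1).getMethodName

  def putValues(): String = tryToPut()

  def main(args: Array[String]): Unit =
    // Prints the name of tryToPut's immediate caller; calling
    // new Throwable().printStackTrace() instead would dump every frame.
    println(putValues())
}
```

`new Throwable().printStackTrace()` simply writes all of these frames to stderr at once, which is exactly what we want in the Spark log.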

現(xiàn)在舉一個實(shí)際的例子來說明問題。比如我們在啟動spark-shell之后,輸入一句非常簡單的sc.textFile("README.md"),會輸出下述的log:

14/07/05 19:53:27 INFO MemoryStore: ensureFreeSpace(32816) called with curMem=0, maxMem=308910489
14/07/05 19:53:27 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 32.0 KB, free 294.6 MB)
14/07/05 19:53:27 DEBUG BlockManager: Put block broadcast_0 locally took 78 ms
14/07/05 19:53:27 DEBUG BlockManager: Putting block broadcast_0 without replication took 79 ms
res0: org.apache.spark.rdd.RDD[String] = README.md MappedRDD[1] at textFile at <console>:13

Suppose I want to know who calls the tryToPut function that emits the second log line. The answer: open MemoryStore.scala and find this statement:

logInfo("Block %s stored as %s in memory (estimated size %s, free %s)".format(
          blockId, valuesOrBytes, Utils.bytesToString(size), Utils.bytesToString(freeMemory)))

Just above it, add:

new Throwable().printStackTrace()

然后,重新進(jìn)行源碼編譯

sbt/sbt assembly

Open spark-shell again and run sc.textFile("README.md"); the output now includes the following, which shows clearly who calls tryToPut:

14/07/05 19:53:27 INFO MemoryStore: ensureFreeSpace(32816) called with curMem=0, maxMem=308910489
14/07/05 19:53:27 WARN MemoryStore: just show the calltrace by entering some modified code
java.lang.Throwable
  at org.apache.spark.storage.MemoryStore.tryToPut(MemoryStore.scala:182)
  at org.apache.spark.storage.MemoryStore.putValues(MemoryStore.scala:76)
  at org.apache.spark.storage.MemoryStore.putValues(MemoryStore.scala:92)
  at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:699)
  at org.apache.spark.storage.BlockManager.put(BlockManager.scala:570)
  at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:821)
  at org.apache.spark.broadcast.HttpBroadcast.<init>(HttpBroadcast.scala:52)
  at org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcastFactory.scala:35)
  at org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcastFactory.scala:29)
  at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
  at org.apache.spark.SparkContext.broadcast(SparkContext.scala:787)
  at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:556)
  at org.apache.spark.SparkContext.textFile(SparkContext.scala:468)
  at $line5.$read$iwC$iwC$iwC$iwC.<init>(<console>:13)
  at $line5.$read$iwC$iwC$iwC.<init>(<console>:18)
  at $line5.$read$iwC$iwC.<init>(<console>:20)
  at $line5.$read$iwC.<init>(<console>:22)
  at $line5.$read.<init>(<console>:24)
  at $line5.$read$.<init>(<console>:28)
  at $line5.$read$.<clinit>(<console>)
  at $line5.$eval$.<init>(<console>:7)
  at $line5.$eval$.<clinit>(<console>)
  at $line5.$eval.$print()
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:483)
  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
  at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
  at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
  at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
  at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
  at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
  at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
  at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
  at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
  at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
  at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
  at org.apache.spark.repl.Main$.main(Main.scala:31)
  at org.apache.spark.repl.Main.main(Main.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:483)
  at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
14/07/05 19:53:27 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 32.0 KB, free 294.6 MB)
14/07/05 19:53:27 DEBUG BlockManager: Put block broadcast_0 locally took  78 ms
14/07/05 19:53:27 DEBUG BlockManager: Putting block broadcast_0 without replication took  79 ms
res0: org.apache.spark.rdd.RDD[String] = README.md MappedRDD[1] at textFile at <console>:13

Syncing with git

對代碼作了修改之后,如果并不想提交代碼,那該如何將最新的內(nèi)容同步到本地呢?

git reset --hard
git pull origin master
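The reset-and-pull above throws the debug edits away. If you would rather keep them while syncing, a stash is a standard alternative (plain git commands, nothing project-specific):

```shell
git stash                # shelve the uncommitted debug edits
git pull origin master   # fast-forward to the latest upstream code
git stash pop            # re-apply the edits on top of it
```

This way the printStackTrace experiment survives the update and can be dropped later with a single `git checkout -- <file>`.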

Tracing Akka messages

Finding out who receives a message is comparatively easy: good use of grep is all it takes, provided you know a little about the actor model.

還是舉個實(shí)例吧,我們知道CoarseGrainedSchedulerBackend會發(fā)送LaunchTask消息出來,那么誰是接收方呢?只需要執(zhí)行以下腳本即可。

grep LaunchTask -r core/src/main
The output below shows clearly that CoarseGrainedExecutorBackend is the receiver of LaunchTask; to see how the message is handled after it arrives, just look at the receiver's receive function.
core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala: case LaunchTask(data) =>
core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala: logError("Received LaunchTask command but executor was null")
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedClusterMessage.scala: case class LaunchTask(data: SerializableBuffer) extends CoarseGrainedClusterMessage
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala: executorActor(task.executorId) ! LaunchTask(new SerializableBuffer(serializedTask))
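The pattern that grep uncovers can be sketched without Akka. Below is a hypothetical, self-contained stand-in (the real code sends the message with the actor `!` operator and handles it in the actor's receive method): a message case class defined once, matched by the receiver.

```scala
// Hypothetical stand-in for the message defined in CoarseGrainedClusterMessage.
case class LaunchTask(data: String)

object ReceiveSketch {
  // Mirrors the shape of CoarseGrainedExecutorBackend.receive:
  // pattern-match on the incoming message to dispatch it.
  def receive(msg: Any): String = msg match {
    case LaunchTask(data) => s"launching task: $data"
    case other            => s"unhandled message: $other"
  }
}
```

Because the message is a case class, grepping for its name finds all three sites at once: the definition, the sender, and the receiver's match arm.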

Original article: Apache Spark Source Code Walkthrough, Part 17 -- How to Read Code (editor: 仲浩)
