Spark Environment Problems and Solutions

Scenario
Documenting environment problems encountered during Spark development, along with how each was fixed.
Notes
  • Problem 1: the machine ran out of physical memory
  • Symptom: starting Spark with ./start-all.sh threw an exception. Following the prompt, checking the relevant log (cat spark-hadoop-org.apache.spark.deploy.master.Master-1-master.out) revealed:

    # There is insufficient memory for the Java Runtime Environment to continue.
    # Native memory allocation (mmap) failed to map 715849728 bytes for committing reserved memory.
    # Possible reasons:
    #   The system is out of physical RAM or swap space
    #   In 32 bit mode, the process size limit was hit
    # Possible solutions:
    #   Reduce memory load on the system
    #   Increase physical memory or swap space
    #   Check if swap backing store is full
    #   Use 64 bit Java on a 64 bit OS
    #   Decrease Java heap size (-Xmx/-Xms)
    #   Decrease number of Java threads
    #   Decrease Java thread stack sizes (-Xss)
    #   Set larger code cache with -XX:ReservedCodeCacheSize=
    # This output file may be truncated or incomplete.
    #
    #  Out of Memory Error (os_linux.cpp:2627), pid=22008, tid=139666963719936
    #
    # JRE version:  (8.0_60-b27) (build )
    # Java VM: Java HotSpot(TM) 64-Bit Server VM (25.60-b23 mixed mode linux-amd64 compressed oops)
    # Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again

    Fix: free -m showed too little available memory, so the Aliyun instance was upgraded to 1 core / 2 GB RAM (about 80 yuan).
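The diagnosis above can be reproduced from the shell before deciding between adding memory and shrinking heaps; a minimal sketch (`free` and `awk` are standard on Linux):

```shell
# Show total/used/available RAM and swap in MiB, as in the fix above.
free -m

# Extract just the available-memory figure; if it is well below the
# ~700 MB the JVM tried to mmap, the start-all.sh failure is expected.
free -m | awk '/^Mem:/ {print "available MiB:", $NF}'
```

If upgrading the instance is not an option, the log's own suggestions (adding swap or lowering -Xmx for the daemons) are the alternatives.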
  • Problem 2: the JVM ran out of memory
  • Symptom: packaging the Scala program with sbt produced:

    hadoop@master:~/sparkapp$ /usr/local/sbt/sbt package
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
    [info] Set current project to Simple Project (in build file:/home/hadoop/sparkapp/)
    [info] Updating {file:/home/hadoop/sparkapp/}sparkapp...
    [info] Resolving org.scala-lang#scala-library;2.10.4 ...
    [info] Updating {file:/home/hadoop/sparkapp/}sparkapp...
    [info] Resolving org.scala-lang#scala-library;2.10.4 ...
    java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: unable to create new native thread
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at sbt.ConcurrentRestrictions$$anon$4.take(ConcurrentRestrictions.scala:188)
        at sbt.Execute.next$1(Execute.scala:83)
        at sbt.Execute.processAll(Execute.scala:86)

    Fix: increase the memory available to the JVM, e.g. set JAVA_OPTS=-Xms512m -Xmx1024m
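One way to apply that fix is to export the option in the shell before invoking sbt; a sketch (the 512m/1024m values come from the fix above, and SBT_OPTS is an alternative variable the sbt launcher script also honors):

```shell
# Raise the sbt JVM's heap: -Xms is the initial size, -Xmx the ceiling.
export JAVA_OPTS="-Xms512m -Xmx1024m"

# The sbt launcher script also reads SBT_OPTS; either variable works.
export SBT_OPTS="-Xmx1024m"

# Confirm the setting, then re-run: /usr/local/sbt/sbt package
echo "$JAVA_OPTS"
```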
  • Problem 3: incompatible Spark and Scala versions
  • Symptom: running the WordCount program from IDEA produces:
    Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:159)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    Fix: Scala 2.11.8 and Spark 1.6.0 have a compatibility problem. After switching the Scala environment to Scala 2.10.4, the WordCount program ran successfully.
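The version pin can also be made explicit in the sbt project so the mismatch cannot recur; a hedged sketch (the project name is taken from the sbt log above, and the `spark-core` artifacts for Spark 1.6.0 were published against Scala 2.10, which is why pinning 2.10.4 resolves the NoSuchMethodError):

```shell
# Write a build.sbt that pins Scala 2.10.4 to match Spark 1.6.0's
# Scala 2.10 binary artifacts (mixing in Scala 2.11.8 triggers the
# NoSuchMethodError shown above).
cat > build.sbt <<'EOF'
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
EOF

# Sanity-check the pinned version before re-running: sbt package
grep 'scalaVersion' build.sbt   # prints: scalaVersion := "2.10.4"
```

The `%%` operator makes sbt append the Scala binary suffix, so the dependency resolves to spark-core_2.10 automatically.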
    Summary
    Yes, I love problems! I am a problem-solver!
