I get the following error when running my program on my local system.
My machine has 3 GB of RAM; I need a solution.
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
at SparkCore.cartesianTransformation$.main(cartesianTransformation.scala:11)
at SparkCore.cartesianTransformation.main(cartesianTransformation.scala)
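The two numbers in the exception are not arbitrary: Spark's UnifiedMemoryManager reserves 300 MiB of system memory and refuses to start unless the driver JVM heap is at least 1.5 times that reserved amount. A minimal sketch of the arithmetic (the constant names are illustrative, not Spark's actual identifiers):

```python
# Spark reserves 300 MiB for system use (RESERVED_SYSTEM_MEMORY_BYTES in
# UnifiedMemoryManager) and requires the driver heap to be at least 1.5x that.
RESERVED_MIB = 300
reserved_bytes = RESERVED_MIB * 1024 * 1024          # 314572800 bytes
minimum_bytes = int(1.5 * reserved_bytes)            # 471859200 bytes, as in the error

actual_bytes = 259522560                             # what the driver JVM actually got
print(minimum_bytes)                                 # 471859200 (= 450 MiB)
print(actual_bytes / (1024 * 1024))                  # 247.5 MiB -- below the minimum
```

So the driver here was launched with roughly 247 MiB of heap, well under the 450 MiB floor, which is why the SparkContext fails during construction.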
Best Answer
It looks like your Spark driver is running with a small heap. Try increasing the driver memory, for example by passing --driver-memory 4g when submitting the application. Since your machine has only 3 GB of RAM, a smaller value such as 1g or 2g is more realistic and still well above the ~450 MiB minimum.
Hope this helps!
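Concretely, the flag goes on the spark-submit command line (the class and jar names below are placeholders matching this question's code, not verified paths). Note that in local mode the driver heap is the JVM heap itself, so it must be set at launch; setting spark.driver.memory inside an already-running SparkSession has no effect:

```shell
# Launch with a larger driver heap; 1g comfortably clears Spark's 450 MiB minimum.
spark-submit \
  --master "local[*]" \
  --driver-memory 1g \
  --class SparkCore.cartesianTransformation \
  target/my-spark-app.jar
```

If you run the program from an IDE instead of spark-submit, the equivalent is enlarging the JVM heap directly (e.g. the VM option -Xmx1g), since --driver-memory is only read by the launcher.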
Regarding "hadoop - Spark object runtime error", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44243438/