I took the following code from http://spark.apache.org, but I'm running into the error below....
Code:
JavaRDD<String> lines = sc.textFile(logFile);
JavaPairRDD<String, Integer> pairs = lines.mapToPair(s -> new Tuple2<>(s, 1));
Error:
lambda expressions are not supported in -source 1.5 (use -source 8 or higher to enable lambda expressions)
But the following code works fine. Could you help me figure out why?
New code:
JavaRDD<String> lines = sc.textFile(logFile);
JavaPairRDD<String, String> prodPairs = lines.mapToPair(new PairFunction<String, String, String>() {
    @Override
    public Tuple2<String, String> call(String s) {
        String[] prodSplit = s.split(",");
        return new Tuple2<String, String>(prodSplit[2], prodSplit[0] + "," + prodSplit[1] + "," + prodSplit[2]);
    }
});
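The difference between the two snippets is purely syntactic: the anonymous inner class in the working version is pre-Java-8 syntax, so a compiler running with -source 1.5 accepts it, while the lambda `s -> new Tuple2(s, 1)` requires -source 8. A minimal, Spark-free sketch of the same contrast (using a hypothetical `Pairer` interface as a stand-in for Spark's `PairFunction`):

```java
public class LambdaVsAnonymous {
    // A single-abstract-method interface, analogous to Spark's PairFunction.
    interface Pairer {
        String call(String s);
    }

    // Pre-Java-8 style: anonymous inner class. Valid even under -source 1.5.
    static final Pairer ANON = new Pairer() {
        public String call(String s) {
            return s + ",1";
        }
    };

    // Java 8 style: a lambda targeting the same interface.
    // javac rejects this unless compiling with -source 8 or higher.
    static final Pairer LAMBDA = s -> s + ",1";

    public static void main(String[] args) {
        // Both forms produce identical results at runtime.
        System.out.println(ANON.call("apple"));   // apple,1
        System.out.println(LAMBDA.call("apple")); // apple,1
    }
}
```

So the "new code" works not because its logic differs, but because it avoids Java 8 syntax entirely; raising the compiler's source level (as in the answer below) makes the lambda version compile too.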
Best answer
Are you using Maven?
By default, Maven compiles with both source and target set to Java 1.5:
Also note that at present the default source setting is 1.5 and the default target setting is 1.5, independently of the JDK you run Maven with. If you want to change these defaults, you should set source and target as described in Setting the -source and -target of the Java Compiler. (Source)
You need to configure the compiler plugin in the build section of your pom:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.2</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>
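As an alternative (assuming a Maven 3.x build with a reasonably recent maven-compiler-plugin), the same source/target levels can be set through the well-known `maven.compiler.*` properties instead of configuring the plugin explicitly:

```xml
<properties>
    <!-- Picked up by maven-compiler-plugin as its source/target defaults -->
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>
```

Either form tells javac to accept Java 8 syntax, which is what the lambda in the original code needs.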
Regarding "java - Spark Java error: lambda expressions are not supported in -source 1.5", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40636342/