Java Spark: calling the parallelizePairs operator reports an error:

Incompatible equality constraint: String and T1

        List<Tuple2<String, Double>> arr2 = Arrays.asList(
                new Tuple2<String, Double>("u1", 20.01),
                new Tuple2<String, Double>("u2", 18.95),
                new Tuple2<String, Double>("u3", 20.55),
                new Tuple2<String, Double>("u4", 20.12),
                new Tuple2<String, Double>("u5", 100.11)
        );
        JavaPairRDD<String, Double> rdd2 = jsc.parallelizePairs(arr2); // the error above is flagged on this call

The fix is to supply the type arguments explicitly (a type witness), so they no longer have to be inferred from the Tuple2 list. Change the code as follows:

JavaPairRDD<String, Double> rdd2 = jsc.<String, Double>parallelizePairs(arr2);
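
For reference, a minimal self-contained sketch of the whole flow with the explicitly typed call in place. The master URL "local[*]", the application name "PairRddDemo", and the class name are illustrative values, not part of the original post:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class PairRddDemo {
    public static void main(String[] args) {
        // Illustrative local-mode configuration
        SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("PairRddDemo");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        List<Tuple2<String, Double>> arr2 = Arrays.asList(
                new Tuple2<String, Double>("u1", 20.01),
                new Tuple2<String, Double>("u2", 18.95),
                new Tuple2<String, Double>("u3", 20.55),
                new Tuple2<String, Double>("u4", 20.12),
                new Tuple2<String, Double>("u5", 100.11)
        );

        // The explicit <String, Double> type witness avoids the
        // "Incompatible equality constraint: String and T1" inference error
        JavaPairRDD<String, Double> rdd2 = jsc.<String, Double>parallelizePairs(arr2);

        // Collect to the driver and print each (key, value) pair
        for (Tuple2<String, Double> t : rdd2.collect()) {
            System.out.println(t._1() + " -> " + t._2());
        }

        jsc.stop();
    }
}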

