Spark Question 1: How to find SparkContexts that were never stopped


2023-01-04




After a Spark job finishes, it is easy to forget to call sc.stop(). For example, this run ended with a leftover-thread warning:


hadoop@Master:~/cloud/testByXubo/spark/hs38DH/package$ ./cluster.sh 
fq0.count:105887
Method 1=> Length:2971 sum:7888989 time:34715ms
Method 2=> Length:2971 sum2:7888989 time:5665ms
Method 3=> Length:2971 sum3:7888989.0 time:1362ms
16/04/17 14:15:21 WARN QueuedThreadPool: 2 threads could not be stopped
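
One way to avoid the problem in the first place is to wrap the job body in try/finally so that sc.stop() always runs, even when the job throws. A minimal sketch, assuming a standalone Scala driver (the app name, input path, and workload below are placeholders, not the actual cluster.sh job):

import org.apache.spark.{SparkConf, SparkContext}

object SafeStopExample {
  def main(args: Array[String]): Unit = {
    // Placeholder app name; the real job's configuration would go here.
    val sc = new SparkContext(new SparkConf().setAppName("SafeStopExample"))
    try {
      // Placeholder workload standing in for the real computation.
      val count = sc.textFile("hdfs:///tmp/fq0.fastq").count()
      println(s"count: $count")
    } finally {
      sc.stop() // always runs, releasing executors and driver-side resources
    }
  }
}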


How can I check how many unclosed SparkContexts there are on the cluster?

Does this affect the cluster's memory or otherwise hurt performance?
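
A note on checking, assuming a standalone deployment like the Master host above: the Spark master web UI (port 8080 by default) lists the cluster's Running Applications, and an application whose driver is still alive but never called sc.stop() stays in that list with its executors, cores, and memory still reserved. Once the driver JVM itself exits, the master drops the application when the connection closes, so lingering resources are mainly a concern for long-lived drivers such as an open spark-shell. On YARN, the equivalent check is yarn application -list -appStates RUNNING.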
