ERROR org.apache.spark.rpc.netty.Inbox: Ignoring error org.apache.spark.SparkException: Could not find CoarseGrainedScheduler
A commonly suggested fix for this error is to disable dynamic allocation:

spark-submit --conf spark.dynamicAllocation.enabled=false
I hadn't set this parameter, and it is already false by default, so that was not the cause.
Another suggestion was to increase num-executors, but it was already set to 100, which seemed more than sufficient.
After further debugging, I found that the Spark job generated very few tasks, while the number of executors requested at submit time was far larger than needed. Reducing the --num-executors parameter resolved the problem.
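As a sketch of the resubmission, the invocation below shows a reduced executor count. All paths, class names, and the specific values (10 executors, cores, memory) are placeholders for illustration, not the values from the original job:

```shell
# Hypothetical resubmission sketch; adjust values to your own job.
# The key change is reducing --num-executors so it is closer to the
# number of tasks the job actually generates, instead of leaving
# most executors idle (idle executors being torn down can trigger
# the "Could not find CoarseGrainedScheduler" noise on shutdown).
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.dynamicAllocation.enabled=false \
  --num-executors 10 \
  --executor-cores 2 \
  --executor-memory 4g \
  --class com.example.MyJob \
  my-job.jar
```

A rough rule of thumb: if the job's largest stage has N tasks, requesting many more than N executors wastes resources, since each task occupies only one core at a time.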
The final value for --num-executors was set to