
cannot submit tasks to master #130

@pzg250

Hi,
I ran into an issue. Can anyone help? Thanks in advance.

After deploying docker-spark to a server (192.168.10.8), I tried to test it from another server (192.168.10.7). The same version of Spark is installed on 192.168.10.7.
Command steps:

spark-shell --master spark://192.168.10.8:7077 --total-executor-cores 1 --executor-memory 512M
# xxxx
# some output here
# xxxx
val textFile = sc.textFile("file:///opt/spark/README.md");
textFile.first();

I get the error below (it repeats in an infinite loop):

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
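
For reference, here is the same reproduction as a self-contained Scala application. This is only a minimal sketch that assumes the master URL and resource settings from the spark-shell invocation above; it should hit the same warning whenever no worker is registered with the master or the offered resources are insufficient.

import org.apache.spark.sql.SparkSession

object ReadmeFirstLine {
  def main(args: Array[String]): Unit = {
    // Master URL and resource limits copied from the spark-shell command above;
    // adjust them to match the actual deployment.
    val spark = SparkSession.builder()
      .appName("readme-first-line")
      .master("spark://192.168.10.8:7077")
      .config("spark.cores.max", "1")          // equivalent of --total-executor-cores 1
      .config("spark.executor.memory", "512m") // equivalent of --executor-memory 512M
      .getOrCreate()

    // file:// paths are resolved on the executors, so /opt/spark/README.md
    // has to exist on the worker nodes as well, not only on the driver.
    val textFile = spark.sparkContext.textFile("file:///opt/spark/README.md")
    println(textFile.first())

    spark.stop()
  }
}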
