
[🐛 Bug]: docker-selenium does not start correctly due to exit of novnc on openEuler #2438

Closed
BlackManne opened this issue Oct 22, 2024 · 26 comments · Fixed by #2443

Comments

@BlackManne

What happened?

I tried to launch a docker-selenium container on an openEuler machine. The Selenium service is part of a Python project made up of four containers: the main container (called sbom), elasticsearch, mongodb, and selenium. The selenium service appears to run correctly because it shows up in "docker ps", but when I try to invoke it with an HTTP request, an error occurs (message: connection refused). When I execute "docker exec selenium curl http://localhost:4444/wd/hub/status", it fails as well.

Command used to start Selenium Grid with Docker (or Kubernetes)

# docker-compose.yaml
services:
  elasticsearch:
    image: docker.m.daocloud.io/library/elasticsearch:8.12.1
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - network.host=0.0.0.0
    ports:
      - "9200:9200"
      - "9300:9300"
    volumes:
      - es_data:/usr/share/elasticsearch/data
    networks:
      - sbom_network

  mongodb:
    image: docker.m.daocloud.io/library/mongo:4.4
    container_name: mongodb
    ports:
      - "27017:27017"
    volumes:
      - mongodb_data:/data/db
    networks:
      - sbom_network

  selenium:
    image: selenium/standalone-chromium:125.0
    shm_size: 2gb
    container_name: selenium
    ports:
      - 4444:4444
      - 7900:7900
      - 5900:5900
    networks:
      - sbom_network

  sbom:
    build:
      context: .
      dockerfile: dockerfile
    container_name: sbom
    depends_on:
      - elasticsearch
      - mongodb
      - selenium
    ports:
      - "5000:5000"
    networks:
      - sbom_network

networks:
  sbom_network:
    driver: bridge
    ipam:
      config:
        - subnet: 172.18.0.0/16

volumes:
  es_data:
  mongodb_data:

# usage in the python project
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument('--headless')
driver = webdriver.Remote(command_executor='http://selenium:4444/wd/hub', options=options)
driver.get(url)
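
Not part of the original report, but relevant to how the containers are wired: compose's depends_on only waits for the selenium container to be started, not for the Grid inside it to be ready, so a readiness probe before the first webdriver.Remote call can rule out a plain startup race. A minimal sketch, assuming the container name selenium from the compose file above:

# Hypothetical readiness probe: loop until /status reports ready (Ctrl-C to abort)
until docker exec selenium curl -sf http://localhost:4444/wd/hub/status | grep -q '"ready": *true'; do
  echo "waiting for the Selenium Grid to become ready..."
  sleep 2
done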

Relevant log output

#docker logs selenium
2024-10-22 05:28:35,839 INFO Included extra file "/etc/supervisor/conf.d/chrome-cleanup.conf" during parsing
2024-10-22 05:28:35,839 INFO Included extra file "/etc/supervisor/conf.d/selenium.conf" during parsing
2024-10-22 05:28:35,874 INFO RPC interface 'supervisor' initialized
2024-10-22 05:28:35,874 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2024-10-22 05:28:35,875 INFO supervisord started with pid 9
2024-10-22 05:28:36,878 INFO spawned: 'xvfb' with pid 10
2024-10-22 05:28:36,880 INFO spawned: 'vnc' with pid 11
2024-10-22 05:28:36,882 INFO spawned: 'novnc' with pid 12
2024-10-22 05:28:36,886 INFO spawned: 'selenium-standalone' with pid 13
2024-10-22 05:28:36,891 INFO success: xvfb entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 05:28:36,892 INFO success: vnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 05:28:36,892 INFO success: novnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 05:28:36,892 INFO success: selenium-standalone entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 05:28:36,905 INFO exited: novnc (exit status 1; not expected)

#docker exec selenium curl http://localhost:4444/wd/hub/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (7) Failed to connect to localhost port 4444 after 0 ms: Connection refused

Operating System

openEuler 22.03 LTS for ARM

Docker Selenium version (image tag)

125.0

Selenium Grid chart version (chart version)

No response


@BlackManne, thank you for creating this issue. We will troubleshoot it as soon as we can.


Info for maintainers

Triage this issue by using labels.

If information is missing, add a helpful comment and then I-issue-template label.

If the issue is a question, add the I-question label.

If the issue is valid but there is no time to troubleshoot it, consider adding the help wanted label.

If the issue requires changes or fixes from an external project (e.g., ChromeDriver, GeckoDriver, MSEdgeDriver, W3C), add the applicable G-* label, and it will provide the correct link and auto-close the issue.

After troubleshooting the issue, please add the R-awaiting answer label.

Thank you!

@BlackManne BlackManne changed the title [🐛 Bug]: docke-selenium does not start correctly due to exit of novnc on openEuler [🐛 Bug]: docker-selenium does not start correctly due to exit of novnc on openEuler Oct 22, 2024
@VietND96
Member

I saw that the novnc process failed, but I don't have much info to understand what the issue could be.
Can you try setting the env var START_NO_VNC=false to see if the other processes can come up?
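
For reference, a hypothetical way to try that with a one-off run instead of compose (later comments in this thread use the SE_-prefixed form of the variable); a sketch, not verified on openEuler:

# One-off run with noVNC disabled
docker run --rm -p 4444:4444 --shm-size=2g \
  -e SE_START_NO_VNC=false \
  selenium/standalone-chromium:125.0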

@BlackManne
Author

I tried this method, here is the result:
#docker logs selenium
2024-10-22 07:59:18,398 INFO Included extra file "/etc/supervisor/conf.d/chrome-cleanup.conf" during parsing
2024-10-22 07:59:18,398 INFO Included extra file "/etc/supervisor/conf.d/selenium.conf" during parsing
2024-10-22 07:59:18,523 INFO RPC interface 'supervisor' initialized
2024-10-22 07:59:18,523 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2024-10-22 07:59:18,524 INFO supervisord started with pid 9
2024-10-22 07:59:19,526 INFO spawned: 'xvfb' with pid 10
2024-10-22 07:59:19,528 INFO spawned: 'vnc' with pid 11
2024-10-22 07:59:19,530 INFO spawned: 'novnc' with pid 12
2024-10-22 07:59:19,533 INFO spawned: 'selenium-standalone' with pid 13
2024-10-22 07:59:19,567 INFO success: xvfb entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 07:59:19,567 INFO success: vnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 07:59:19,567 INFO success: novnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 07:59:19,567 INFO success: selenium-standalone entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-22 07:59:19,568 INFO exited: novnc (exit status 0; expected)

@BlackManne
Author

novnc exited as well, but the exit status changed from 1 to 0

@VietND96
Member

Without noVNC, can the Selenium server respond in your test?

@BlackManne
Author

Still, it cannot respond. Here is the error message:
#docker exec selenium curl http://localhost:4444/wd/hub/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (7) Failed to connect to localhost port 4444 after 0 ms: Connection refused

#invoke by http request:
File "/SBOM/ExternalSearchers/nvd_searcher.py", line 26, in selenium_parse
driver = webdriver.Remote(command_executor='http://selenium:4444/wd/hub', options=options)
File "/usr/local/lib/python3.9/site-packages/selenium/webdriver/remote/webdriver.py", line 208, in init
self.start_session(capabilities)
File "/usr/local/lib/python3.9/site-packages/selenium/webdriver/remote/webdriver.py", line 292, in start_session
response = self.execute(Command.NEW_SESSION, caps)["value"]
File "/usr/local/lib/python3.9/site-packages/selenium/webdriver/remote/webdriver.py", line 345, in execute
response = self.command_executor.execute(driver_command, params)
File "/usr/local/lib/python3.9/site-packages/selenium/webdriver/remote/remote_connection.py", line 300, in execute
return self._request(command_info[0], url, body=data)
File "/usr/local/lib/python3.9/site-packages/selenium/webdriver/remote/remote_connection.py", line 321, in _request
response = self._conn.request(method, url, body=body, headers=headers)
File "/usr/local/lib/python3.9/site-packages/urllib3/_request_methods.py", line 143, in request
return self.request_encode_body(
File "/usr/local/lib/python3.9/site-packages/urllib3/_request_methods.py", line 278, in request_encode_body
return self.urlopen(method, url, **extra_kw)
File "/usr/local/lib/python3.9/site-packages/urllib3/poolmanager.py", line 443, in urlopen
response = conn.urlopen(method, u.request_uri, **kw)
File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 873, in urlopen
return self.urlopen(
File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 873, in urlopen
return self.urlopen(
File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 873, in urlopen
return self.urlopen(
File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 843, in urlopen
retries = retries.increment(
File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 519, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='selenium', port=4444): Max retries exceeded with url: /wd/hub/session (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff7437f280>: Failed to establish a new connection: [Errno 111] Connection refused'))

@VietND96
Member

Can you try with image selenium/standalone-chromium:latest?

@BlackManne
Author

BlackManne commented Oct 25, 2024

I already tried that... it still doesn't work.
In fact, I have the same problem described in issue #2416

Some outputs are listed as follows:

docker exec -it selenium bash

Warning: could not find self.pem
The path /opt/bin/noVNC/utils/websockify exists, but /opt/bin/noVNC/utils/websockify/run either does not exist or is not executable.
If you intended to use an installed websockify package, please remove /opt/bin/noVNC/utils/websockify.

seluser@887b879fb410:/$ /opt/bin/noVNC/utils/websockify/run
/opt/bin/noVNC/utils/websockify/websockify/websocket.py:31: UserWarning: no 'numpy' module, HyBi protocol will be slower
warnings.warn("no 'numpy' module, HyBi protocol will be slower")
Usage:
main.py [options] [source_addr:]source_port target_addr:target_port
main.py [options] --token-plugin=CLASS [source_addr:]source_port
main.py [options] --unix-target=FILE [source_addr:]source_port
main.py [options] [source_addr:]source_port -- WRAP_COMMAND_LINE

main.py: error: Too few arguments

@VietND96
Member

What Docker version are you using on your host?

@BlackManne
Author

BlackManne commented Oct 25, 2024

My docker version is:
Docker version 26.1.4, build 5650f9b

@VietND96
Member

I think I found something and fixed it. I am deploying a preview image for your verification before publishing the fix. I will inform you once it is done.

@BlackManne
Author

BlackManne commented Oct 25, 2024

I noticed that this issue is closed. Should I pull and use the latest image as the solution?

@BlackManne
Author

Many thanks to you!! Looking forward to your message :)

@VietND96
Member

@BlackManne, can you use the nightly build selenium/standalone-chromium:nightly to verify and confirm?
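
A hypothetical way to switch the compose service onto that tag for this check, assuming the compose file from the issue description with its image line changed to selenium/standalone-chromium:nightly:

docker pull selenium/standalone-chromium:nightly
docker compose up -d --force-recreate selenium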

@BlackManne
Author

It seems it doesn't work... and I got more errors when executing /opt/bin/noVNC/utils/websockify/run:
OpenBLAS blas_thread_init: pthread_create failed for thread 1 of 64: Operation not permitted
OpenBLAS blas_thread_init: RLIMIT_NPROC -1 current, -1 max
OpenBLAS blas_thread_init: pthread_create failed for thread 2 of 64: Operation not permitted
OpenBLAS blas_thread_init: RLIMIT_NPROC -1 current, -1 max
[the same pair of lines repeats for threads 3 through 62]
OpenBLAS blas_thread_init: pthread_create failed for thread 63 of 64: Operation not permitted
OpenBLAS blas_thread_init: RLIMIT_NPROC -1 current, -1 max
/usr/local/lib/python3.12/dist-packages/websockify-0.12.0-py3.12.egg/websockify/websocket.py:31: UserWarning: no 'numpy' module, HyBi protocol will be slower
warnings.warn("no 'numpy' module, HyBi protocol will be slower")
Usage:
main.py [options] [source_addr:]source_port target_addr:target_port
main.py [options] --token-plugin=CLASS [source_addr:]source_port
main.py [options] --unix-target=FILE [source_addr:]source_port
main.py [options] [source_addr:]source_port -- WRAP_COMMAND_LINE

main.py: error: Too few arguments
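
A side note not raised in the thread: repeated "pthread_create failed ... Operation not permitted" lines usually mean the host is blocking thread creation inside the container (for example an nproc ulimit or a restrictive seccomp profile on an older engine/libseccomp combination). A hypothetical experiment to narrow that down, not a confirmed fix:

# Diagnostic only: relax process limits and seccomp for a one-off run
docker run --rm -p 4444:4444 --shm-size=2g \
  --ulimit nproc=65535 \
  --security-opt seccomp=unconfined \
  selenium/standalone-chromium:nightly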

@BlackManne
Author

BlackManne commented Oct 25, 2024

And I got the same errors as with the "latest" tag when I executed the following commands:

docker exec -it selenium bash

seluser@834a4954d3f0:/$ /opt/bin/start-novnc.sh
Warning: could not find self.pem
The path /opt/bin/noVNC/utils/websockify exists, but /opt/bin/noVNC/utils/websockify/run either does not exist or is not executable.
If you intended to use an installed websockify package, please remove /opt/bin/noVNC/utils/websockify.

docker logs selenium

2024-10-25 04:56:16,539 INFO Included extra file "/etc/supervisor/conf.d/chrome-cleanup.conf" during parsing
2024-10-25 04:56:16,539 INFO Included extra file "/etc/supervisor/conf.d/selenium.conf" during parsing
2024-10-25 04:56:16,544 INFO RPC interface 'supervisor' initialized
2024-10-25 04:56:16,545 INFO supervisord started with pid 9
2024-10-25 04:56:17,548 INFO spawned: 'xvfb' with pid 10
2024-10-25 04:56:17,549 INFO spawned: 'vnc' with pid 11
2024-10-25 04:56:17,551 INFO spawned: 'novnc' with pid 12
2024-10-25 04:56:17,553 INFO spawned: 'selenium-standalone' with pid 13
2024-10-25 04:56:17,571 INFO success: xvfb entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 04:56:17,571 INFO success: vnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 04:56:17,572 INFO success: novnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 04:56:17,572 INFO success: selenium-standalone entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 04:56:17,572 WARN exited: novnc (exit status 1; not expected)

@VietND96
Member

OK, at least the recent change solved this message:

The path /opt/bin/noVNC/utils/websockify exists, but /opt/bin/noVNC/utils/websockify/run either does not exist or is not executable.
If you intended to use an installed websockify package, please remove /opt/bin/noVNC/utils/websockify.

I suspect something related to the platform of your host as well. Since I am running the container on my devbox, which is Ubuntu 22.04, I didn't see those errors while the container was running.

@BlackManne
Author

BlackManne commented Oct 25, 2024

OK, at least the recent change solved this message:

The path /opt/bin/noVNC/utils/websockify exists, but /opt/bin/noVNC/utils/websockify/run either does not exist or is not executable.
If you intended to use an installed websockify package, please remove /opt/bin/noVNC/utils/websockify.

I suspect something related to the platform of your host as well. Since I am running the container on my devbox, which is Ubuntu 22.04, I didn't see those errors while the container was running.

I'm sorry, but maybe not? I still get this error message when executing "/opt/bin/start-novnc.sh".
Is there any alternative or other solution? I need to use docker-selenium on openEuler.
Sorry for the inconvenience, and many thanks.

@VietND96
Member

Can you try to start the container with both env vars SE_START_NO_VNC=false and SE_START_VNC=false?

@VietND96
Member

Also, are any other options set in your docker compose command? Or just a simple docker compose up -d?

@BlackManne
Author

Can you try to start the container with both env vars SE_START_NO_VNC=false and SE_START_VNC=false?

I tried, here is the result:

[root@DC4-73-003 sbom]# docker logs selenium
2024-10-25 05:51:40,765 INFO Included extra file "/etc/supervisor/conf.d/chrome-cleanup.conf" during parsing
2024-10-25 05:51:40,765 INFO Included extra file "/etc/supervisor/conf.d/selenium.conf" during parsing
2024-10-25 05:51:40,770 INFO RPC interface 'supervisor' initialized
2024-10-25 05:51:40,771 INFO supervisord started with pid 9
2024-10-25 05:51:41,773 INFO spawned: 'xvfb' with pid 10
2024-10-25 05:51:41,775 INFO spawned: 'vnc' with pid 11
2024-10-25 05:51:41,777 INFO spawned: 'novnc' with pid 12
2024-10-25 05:51:41,779 INFO spawned: 'selenium-standalone' with pid 13
2024-10-25 05:51:41,780 INFO success: xvfb entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 05:51:41,781 INFO success: vnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 05:51:41,781 INFO success: novnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 05:51:41,781 INFO success: selenium-standalone entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 05:51:41,781 INFO exited: vnc (exit status 0; expected)
2024-10-25 05:51:41,782 INFO exited: novnc (exit status 0; expected)

@BlackManne
Author

Also, are any other options set in your docker compose command? Or just a simple docker compose up -d?

just 'docker compose up -d'

@VietND96
Member

In your logs, is there nothing else after exit status 0? Here are the logs that I can see:

docker run --rm -p 4444:4444 -e SE_ENABLE_TRACING=false -e SE_START_VNC=false -e SE_START_NO_VNC=false selenium/standalone-chromium:nightly

2024-10-25 14:04:29,182 INFO Included extra file "/etc/supervisor/conf.d/chrome-cleanup.conf" during parsing
2024-10-25 14:04:29,182 INFO Included extra file "/etc/supervisor/conf.d/selenium.conf" during parsing
2024-10-25 14:04:29,186 INFO RPC interface 'supervisor' initialized
2024-10-25 14:04:29,187 INFO supervisord started with pid 8
2024-10-25 14:04:30,190 INFO spawned: 'xvfb' with pid 9
2024-10-25 14:04:30,192 INFO spawned: 'vnc' with pid 10
2024-10-25 14:04:30,194 INFO spawned: 'novnc' with pid 11
2024-10-25 14:04:30,196 INFO spawned: 'selenium-standalone' with pid 13
2024-10-25 14:04:30,197 INFO success: xvfb entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 14:04:30,198 INFO success: vnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 14:04:30,198 INFO success: novnc entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 14:04:30,198 INFO success: selenium-standalone entered RUNNING state, process has stayed up for > than 0 seconds (startsecs)
2024-10-25 14:04:30,198 INFO exited: vnc (exit status 0; expected)
2024-10-25 14:04:30,199 INFO exited: novnc (exit status 0; expected)
Appending Selenium option: --heartbeat-period 30
Appending Selenium option: --log-level INFO
Appending Selenium option: --http-logs false
Appending Selenium option: --structured-logs false
Appending Selenium option: --reject-unsupported-caps true
Setting up SE_NODE_GRID_URL...
Selenium Grid Standalone configuration: 
[network]
relax-checks = true

[node]
session-timeout = "300"
override-max-sessions = false
detect-drivers = false
drain-after-session-count = 0
max-sessions = 1

[[node.driver-configuration]]
display-name = "chrome"
stereotype = '{"browserName": "chrome", "browserVersion": "130.0", "platformName": "Linux", "goog:chromeOptions": {"binary": "/usr/bin/chromium"}, "se:containerName": ""}'
max-sessions = 1

Starting Selenium Grid Standalone...
Appending Selenium option: --tracing false
Tracing is disabled
Oct 25, 2024 2:04:30 PM org.openqa.selenium.grid.config.TomlConfig <init>
WARNING: Please use quotes to denote strings. Upcoming TOML parser will require this and unquoted strings will throw an error in the future
14:04:30.968 INFO [LoggingOptions.configureLogEncoding] - Using the system default encoding
14:04:30.971 INFO [LoggingOptions.getTracer] - Using null tracer
14:04:31.477 INFO [LoggingOptions.getTracer] - Using null tracer
14:04:31.487 INFO [NodeOptions.getSessionFactories] - Detected 8 available processors
14:04:31.544 INFO [NodeOptions.report] - Adding chrome for {"browserName": "chrome","browserVersion": "130.0","goog:chromeOptions": {"binary": "\u002fusr\u002fbin\u002fchromium"},"platformName": "linux","se:containerName": ""} 1 times
14:04:31.567 INFO [Node.<init>] - Binding additional locator mechanisms: relative
14:04:31.593 INFO [GridModel.setAvailability] - Switching Node 8c3c3ff4-ba52-4dbf-8f87-153f879fcd9d (uri: http://172.17.0.2:4444) from DOWN to UP
14:04:31.593 INFO [LocalDistributor.add] - Added node 8c3c3ff4-ba52-4dbf-8f87-153f879fcd9d at http://172.17.0.2:4444. Health check every 120s
14:04:31.729 INFO [Standalone.execute] - Started Selenium Standalone 4.26.0-SNAPSHOT (revision fbdc7a5): http://172.17.0.2:4444

@BlackManne
Author

No, there is nothing after exit status 0. It seems that the logs you've got are correct, while my logs show the error...
I use docker compose to run docker-selenium; here is the relevant part of my docker-compose.yaml:
  selenium:
    image: selenium/standalone-chromium:nightly
    shm_size: 2gb
    container_name: selenium
    ports:
      - 4444:4444
      - 7900:7900
      - 5900:5900
    ulimits:
      nofile:
        soft: 32768
        hard: 32768
    environment:
      - SE_START_NO_VNC=false
      - SE_START_VNC=false
    networks:
      - sbom_network

@BlackManne
Author

BlackManne commented Oct 25, 2024

openEuler is an aarch64 operating system. In fact, I already tried docker-selenium on my Ubuntu machine; no error occurred and everything went well. That's the most confusing point for me :(

@VietND96
Member

On the openEuler host, can you try using the latest Docker Engine version (v27.3.1) or an older version (e.g. v24.0.9) to see if that helps?
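
A couple of hypothetical host-side checks (not from the thread) that may help when comparing the Ubuntu and openEuler machines:

docker version --format '{{.Server.Version}}'    # engine version actually serving the containers
docker info --format '{{.OSType}}/{{.Architecture}}'
uname -m                                         # expect aarch64 on the openEuler host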
