Yesterday I tried websocketd for the first time and it is amazing!
This little gem created by @joewalnes lets you implement language-agnostic WebSocket applications on top of any command line program that reads text from stdin and writes messages to stdout. A simple (and ingenious) approach.
I think it is perfect for testing too, so I updated an existing integration test for a WebSocket client of mine to use a local websocketd server on TravisCI, instead of relying on an external public service that requires internet access, as it did before (wss://echo.websocket.org). This activity was already on my roadmap, so why not try out websocketd?!
As you can see here (https://github.com/davidemoro/pytest-play-docker/pull/42/files), in .travis.yml I added a before_script calling a travis/setup_websocket.sh script that installs websocketd and runs it in the background on port 8081, backed by a simple travis/echo_ws.sh that reads a line from stdin and echoes it to stdout.
The websocketd syntax is the following:
./websocketd --port=8081 ./echo_ws.sh
where echo_ws.sh can be any stdin/stdout based executable. More details in the next section.
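The handler itself can stay tiny. Here is a minimal sketch of what an echo script like the travis/echo_ws.sh described above might look like (the exact script in the pull request may differ):

```shell
#!/bin/sh
# echo_ws.sh (sketch): websocketd delivers each incoming WebSocket
# message as a line on stdin; every line printed to stdout is sent
# back to the client as a message.
while read -r line; do
  echo "$line"
done
```

websocketd spawns one instance of the handler per connected client, so the loop only ever sees messages belonging to a single connection.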
UPDATE 20190611: the article contents are still valid, but I switched for good from pyup to the requires.io service. Why? There was an unexpected error on my pyup project and the support never sent any kind of feedback for weeks (is the project still alive?!). Anyway, I happily switched to requires.io: I like the way they manage cumulative pull requests with errored packages, and the support is responsive.
Let's say you want to distribute a Python tool with Docker using known good dependency versions, ready to be used by end users... In this article you will see how to continuously keep a Docker Hub container up to date with minimal management effort (because I'm a lazy guy) using GitHub, TravisCI and pyup.
The goal was to reduce manual update activity as much as possible, check that everything works before pushing, minimize build times and keep the Docker container always secure and up to date, with high confidence in the final result.
the pyup service opens a pull request on GitHub for pytest-play-docker to be merged on master (if you want to reduce the pull request rate, configure pyup with a daily or weekly policy; see the pyup docs)
Travis runs the build for the above pull request against the pull request branch and updates the pull request status (passing or failing)
I receive an email notification about the new pull request
All test executions run against the Docker build, so there is a guarantee that what is pushed to Docker Hub works fine: it doesn't only check that the build was successful, it runs integration tests against the Docker build. So no version incompatibilities, no integration issues between all the integrated third party pytest-play plugins, and no issues due to operating system integration (e.g., I recently experienced an issue on Alpine Linux where a pip install psycopg2-binary apparently worked fine, but trying to import psycopg2 inside your code raised an unexpected import error, due to a recent issue reported here: https://github.com/psycopg/psycopg2/issues/684).
So, whenever you run:

docker run --rm -v $(pwd):/src davidemoro/pytest-play

you now know the workflow behind every automated Docker Hub push for pytest-play.
Acknowledgements
Many thanks to Andrea Ratto for the ~10 minute Travis build speedup thanks to the Docker cache: from ~11 minutes to ~1 minute is a huge improvement indeed! It was made possible by the docker pull davidemoro/pytest-play command, building with the --cache-from davidemoro/pytest-play option, and moving the longest steps to a separate, cacheable step (e.g., the very long cassandra-driver compilation moved to requirements_cassandra.txt will be executed only if necessary).
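Put together, the caching approach might look like the following .travis.yml fragment. This is a sketch based on the commands mentioned above, not the exact file from pytest-play-docker:

```yaml
# .travis.yml (sketch): reuse the layers of the last published image
# so only the changed steps are rebuilt.
before_script:
  # "|| true" so the very first build, when no image exists yet, doesn't fail
  - docker pull davidemoro/pytest-play || true
script:
  - docker build --cache-from davidemoro/pytest-play -t davidemoro/pytest-play .
```

The trick is that docker build can only reuse layers that exist locally, hence the initial docker pull; keeping slow steps (like the cassandra-driver compilation) in their own layer means they are skipped whenever their inputs are unchanged.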
The relevant technical details about pytest-play-docker follow (some minor optimizations to the final image size are still possible).
Feedback
Any feedback will always be appreciated.
Do you like the Docker Hub push process for pytest-play? Let me know by becoming a pytest-play stargazer!
In this article we will see how to write HTTP API tests with pytest using YAML files, thanks to pytest-play >= 2.0.0 (pytest-play provides support for Selenium, MQTT, SQL and more; see the third party pytest-play plugins).
The guest star is Chuck Norris, thanks to the public JSON endpoint available at https://api.chucknorris.io/, so you will be able to run this test on your own by following this example.
Obviously this is a joke: Chuck Norris cannot fail, so tests are not needed.
At the link above you'll find the instructions for installing Docker on any platform.
If you want to run this example without Docker, install pytest-play together with the external plugin play_requests, based on the fantastic requests library (play_requests is already included in the Docker container).
Project structure
You need:
a folder (e.g., chuck-norris-api-test)
one or more test_XXX.yml files containing your steps (the test_ prefix and the .yml extension matter)
For example:
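Below is a sketch of what such a test file could look like, say test_categories.yml. The file name, marker and assertion expressions are illustrative; check the pytest-play and play_requests documentation for the exact command schema:

```yaml
# test_categories.yml (sketch): the scenario is repeated once for
# each entry in test_data, with $category substituted in the steps.
markers:
  - categories
test_data:
  - category: dev
  - category: food
steps:
  - comment: fetch the list of available joke categories
    provider: play_requests
    type: GET
    url: "https://api.chucknorris.io/jokes/categories"
    assertion: "'$category' in response.json()"
```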
As you can see, each scenario will be repeated for every item you provide in the test_data structure.
The first example asserts that the categories list returned by the https://api.chucknorris.io/jokes/categories endpoint contains some expected values; the second example shows how to search by category (though probably Chuck Norris will find you, according to the Chuck Norris fact "You don't find Chuck Norris, Chuck Norris finds you!").
It's time to show off with a GET roundhouse kick! Ping me on Twitter (@davidemoro) and share your pytest-play implementation against the random Chuck Norris fact generator by category!