Foreword
Pytest supports custom marks (tags), which let you divide a web project into modules and then run only the module you name. In app automation, if you want to share one code base between Android and iOS, you can also use marks to indicate which cases are for iOS and which are for Android, then pass the mark name at run time to execute only those cases.
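As a quick illustration of that idea, here is a minimal sketch that marks one case as android and one as ios (the file name, test names and mark names are only illustrative, not from the original project); running pytest -m ios would then execute only the iOS case.
# content of test_app.py (hypothetical example)
import pytest

@pytest.mark.android
def test_android_login():
    pass  # would run only with -m android

@pytest.mark.ios
def test_ios_login():
    pass  # would run only with -m ios

if __name__ == "__main__":
    # run only the iOS cases
    pytest.main(["-s", "test_app.py", "-m", "ios"])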
mark tag
1. In the following use cases, mark test_send_http() as webtest:
# content of test_server.py
import pytest

@pytest.mark.webtest
def test_send_http():
    pass  # perform some webtest test for your app

def test_something_quick():
    pass

def test_another():
    pass

class TestClass:
    def test_method(self):
        pass

if __name__ == "__main__":
    pytest.main(["-s", "test_server.py", "-m=webtest"])
Run only the tests marked with webtest. From the command line, add the -m option and pass the mark name webtest:
$ pytest -v -m webtest
============================= test session starts =============================
platform win32 -- Python 3.6.0, pytest-3.6.3, py-1.5.4, pluggy-0.6.0
rootdir: E:\YOYO\se, inifile:
plugins: metadata-1.7.0, html-1.19.0
collected 4 items / 3 deselected

test_server.py .

=================== 1 passed, 3 deselected in 0.10 seconds ====================
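A side note that is not part of the original example: newer pytest versions warn when a mark is not registered. A minimal pytest.ini sketch that registers the webtest mark could look like this (the description text is just an example):
# content of pytest.ini, placed next to test_server.py
[pytest]
markers =
    webtest: mark a test as a web test case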
If you do not want to run the cases marked webtest, use "not webtest":
$ pytest -v -m "not webtest"
import pytest

@pytest.mark.webtest
def test_send_http():
    pass  # perform some webtest test for your app

def test_something_quick():
    pass

def test_another():
    pass

class TestClass:
    def test_method(self):
        pass

if __name__ == "__main__":
    pytest.main(["-s", "test_server.py", "-m", "not webtest"])
Run results
============================= test session starts =============================
platform win32 -- Python 3.6.0, pytest-3.6.3, py-1.5.4, pluggy-0.6.0
rootdir: E:\YOYO\se, inifile:
plugins: metadata-1.7.0, html-1.19.0
collected 4 items / 1 deselected

test_server.py ...

=================== 3 passed, 1 deselected in 0.06 seconds ====================
-v specify the function node id to run
If you want to run a single use case inside a class under a .py module, for example the test_method case in TestClass:
The name of each test_-prefixed (or _test-suffixed) function or method forms the node id of the case. Pass the node id on the command line to run just that case (-v gives verbose output):
$ pytest -v test_server.py::TestClass::test_method
Run it in pycharm:
if __name__ == "__main__":
    pytest.main(["-v", "test_server.py::TestClass::test_method"])

Run results
============================= test session starts =============================
platform win32 -- Python 3.6.0, pytest-3.6.3, py-1.5.4, pluggy-0.6.0 -- E:\python36\python.exe
cachedir: .pytest_cache
metadata: {'Python': '3.6.0', 'Platform': 'Windows-10-10.0.17134-SP0', 'Packages': {'pytest': '3.6.3', 'py': '1.5.4', 'pluggy': '0.6.0'}, 'Plugins': {'metadata': '1.7.0', 'html': '1.19.0'}, 'JAVA_HOME': 'D:\\java\\jdk17'}
rootdir: E:\YOYO\se, inifile:
plugins: metadata-1.7.0, html-1.19.0
collecting ... collected 1 item

test_server.py::TestClass::test_method PASSED                            [100%]

========================== 1 passed in 0.06 seconds ===========================

Of course, you can also choose to run the entire class:
$ pytest -v test_server.py::TestClass
You can also select multiple nodes to run, separated by spaces:
$ pytest -v test_server.py::TestClass test_server.py::test_send_http
Run it in pycharm:
if __name__ == "__main__":
    pytest.main(["-v", "test_server.py::TestClass", "test_server.py::test_send_http"])

-k match use case names
You can use the -k command-line option to pass an expression that is matched against use case names:
$ pytest -v -k http  # running with the above defined example module
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y -- $PYTHON_PREFIX/bin/python3.5
cachedir: .pytest_cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items / 3 deselected

test_server.py::test_send_http PASSED                                   [100%]

=================== 1 passed, 3 deselected in 0.12 seconds ====================

You can also run all the tests and exclude certain use cases by name:
$ pytest -k "not send_http" -v
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y -- $PYTHON_PREFIX/bin/python3.5
cachedir: .pytest_cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items / 1 deselected

test_server.py::test_something_quick PASSED                              [33%]
test_server.py::test_another PASSED                                      [66%]
test_server.py::TestClass::test_method PASSED                           [100%]

=================== 3 passed, 1 deselected in 0.12 seconds ====================

You can also select cases that match either "http" or "quick":
$ pytest -k " http or quick" -v
=========================== test session starts ============================
platform linux -- Python 3.x.y, pytest-3.x.y, py-1.x.y, pluggy-0.x.y -- $PYTHON_PREFIX/bin/python3.5
cachedir: .pytest_cache
rootdir: $REGENDOC_TMPDIR, inifile:
collecting ... collected 4 items / 2 deselected

test_server.py::test_send_http PASSED                                    [50%]
test_server.py::test_something_quick PASSED                             [100%]

=================== 2 passed, 2 deselected in 0.12 seconds ====================
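Following the same pattern as the pycharm examples for -m and node ids above, here is a minimal sketch for running a -k expression from pycharm (it reuses the test_server.py module and the expression from this article):
import pytest

if __name__ == "__main__":
    # select the cases whose names match "http" or "quick"
    pytest.main(["-v", "-k", "http or quick", "test_server.py"])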