@@ -699,117 +699,37 @@ For this example, you should get a report similar to this:
 
 ========================== 3 passed in 29.77 seconds ===========================
 
-Feature Flags
-~~~~~~~~~~~~~
-labgrid includes support for feature flags on a global and target scope.
-Adding a ``@pytest.mark.lg_feature`` decorator to a test ensures it is only
-executed if the desired feature is available:
+.. _usage_pytestplugin_mark_lg_feature:
+
+@pytest.mark.lg_feature()
+~~~~~~~~~~~~~~~~~~~~~~~~~
+labgrid supports :ref:`environment-configuration-feature-flags` in the
+:ref:`environment-configuration`.
+Adding a ``@pytest.mark.lg_feature()`` decorator to a test ensures it is only
+executed if the desired feature is set, either under the target or global
+``features:`` keys.
 
 .. code-block:: python
-   :name: test_feature_flags.py
 
    import pytest
 
    @pytest.mark.lg_feature("camera")
    def test_camera(target):
        pass
 
-Here's an example environment configuration:
-
-.. code-block:: yaml
-   :name: feature-flag-env.yaml
-
-   targets:
-     main:
-       features:
-         - camera
-       resources: {}
-       drivers: {}
+In case the feature is unavailable, pytest will record the missing feature
+as the skip reason.
 
-.. testcode:: pytest-example
-   :hide:
-
-   import pytest
-
-   plugins = ['labgrid.pytestplugin']
-   pytest.main(['--lg-env', 'feature-flag-env.yaml', 'test_feature_flags.py'], plugins)
-
-.. testoutput:: pytest-example
-   :hide:
-
-   ... 1 passed...
-
-This would run the above test, however the following configuration would skip the
-test because of the missing feature:
-
-.. code-block:: yaml
-   :name: feature-flag-skip-env.yaml
-
-   targets:
-     main:
-       features:
-         - console
-       resources: {}
-       drivers: {}
-
-.. testcode:: pytest-example
-   :hide:
-
-   import pytest
-
-   plugins = ['labgrid.pytestplugin']
-   pytest.main(['--lg-env', 'feature-flag-skip-env.yaml', 'test_feature_flags.py'], plugins)
-
-.. testoutput:: pytest-example
-   :hide:
-
-   ... 1 skipped...
-
-pytest will record the missing feature as the skip reason.
-
-For tests with multiple required features, pass them as a list to pytest:
+Tests requiring multiple features are also possible:
 
 .. code-block:: python
-   :name: test_feature_flags_global.py
 
    import pytest
 
    @pytest.mark.lg_feature(["camera", "console"])
    def test_camera(target):
        pass
 
-Features do not have to be set per target, they can also be set via the global
-features key:
-
-.. code-block:: yaml
-   :name: feature-flag-global-env.yaml
-
-   features:
-     - camera
-   targets:
-     main:
-       features:
-         - console
-       resources: {}
-       drivers: {}
-
-.. testcode:: pytest-example
-   :hide:
-
-   import pytest
-
-   plugins = ['labgrid.pytestplugin']
-   pytest.main(['--lg-env', 'feature-flag-global-env.yaml', 'test_feature_flags_global.py'],
-               plugins)
-
-.. testoutput:: pytest-example
-   :hide:
-
-   ... 1 passed...
-
-This YAML configuration would combine both the global and the target features.
-
-
 Test Reports
 ~~~~~~~~~~~~
 
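Note: the rewritten section no longer includes an inline environment example. For reference, the configuration from the removed docs shows how the global and per-target ``features:`` keys combine; a test marked ``@pytest.mark.lg_feature(["camera", "console"])`` would run against the target below, since ``camera`` comes from the global list and ``console`` from the target list:

```yaml
# Example environment file (name is illustrative, e.g. feature-flag-global-env.yaml).
# The global "features" list is merged with each target's own "features" list.
features:
  - camera
targets:
  main:
    features:
      - console
    resources: {}
    drivers: {}
```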