
Make changes to 3 PSA tests that are causing the build to fail #1862

Merged
1 commit merged into p4lang:master on Apr 18, 2019

Conversation

jafingerhut (Contributor)

Two of the three tests are passing now, and should be removed from the XFAIL list.

The third was written in a way that let it pass when it should not have. The STF test has been updated so that it now fails. A separate PR on the behavioral-model code is required to make that test pass.
jafingerhut (Contributor, Author)

@derekso1 FYI

jafingerhut (Contributor, Author)

I believe that until this PR is merged, all other PRs will fail the automated tests.

Until a recent behavioral-model change that implemented more of the PSA architecture, several STF tests for PSA programs in p4c were marked XFAIL. With that change, several of them now pass, so they should no longer be marked XFAIL.

# This packet should be dropped. We send it into port 0, because if
# there are no packets sent into nor expected on port 0, then the test
# infrastructure does not check any of the packets that come out port
# 0, or whether the right number come out port 0.
peteli3 (Contributor) commented Apr 18, 2019

Ahh, we were unaware of this. So the test would have passed regardless, because the "expect" check would never have happened.

Is this specific to port 0? Do you know if there are other ports with this behavior? @jafingerhut

jafingerhut (Contributor, Author)

This is not specific to port 0. If a port number is mentioned in any packet or expect statement of an STF file, then the test infrastructure does complete checking on that port: every packet that comes out of it is verified, including packets that are expected there but never appear.

If a port number is mentioned in no packet or expect statement of the STF file at all, the test infrastructure does no checking of packets that come out of that port, whether 0 packets come out or 1000.
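To illustrate that rule, here is a minimal STF sketch (the ports and hex payloads are hypothetical, not taken from the test under review):

# Ports 0 and 1 are both mentioned below, so the test infrastructure
# fully checks everything that comes out of them: exactly the expected
# packet must appear on port 1, and nothing may appear on port 0.
packet 0 01020304 aabbccdd
expect 1 01020304 aabbccdd

# No packet or expect statement mentions port 2, so whatever the
# program emits on port 2 goes completely unchecked.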

peteli3 (Contributor) commented Apr 18, 2019

Added a question to Andy's comment in CMakeLists.txt, but LGTM.

mihaibudiu (Contributor) left a comment

I will merge this because I trust Andy and it blocks other tests from succeeding.

mihaibudiu merged commit 45707c3 into p4lang:master on Apr 18, 2019
peteli3 pushed a commit to peteli3/p4c that referenced this pull request on Jun 6, 2019:

Make changes to 3 PSA tests that are causing the build to fail (p4lang#1862)
