Integ test CI image-pull error thread
# 💻|contributing
I got a silly-level log of the incident here. There are a few things that are of interest to me.
Although the status resolves as ready, it still runs the build a few lines further down:
ℹ build.missing-sh [debug]  → Image missing-sh:v-bda89cfaa0 already exists
ℹ build.missing-sh          → Build type=container name=missing-sh (from module missing-sh) already complete, nothing to do.
ℹ build.missing-sh [silly]  → Processing node build.missing-sh:process
ℹ build.missing-sh [silly]  → Executing node build.missing-sh:process
ℹ build.missing-sh [silly]  → Completing node build.missing-sh:process. aborted=false, error=null
@alert-helicopter-61082 any ideas?
Those log lines don't mean that the build is happening, just that the "process" node in the graph is being executed, which will promptly return if the status was previously resolved as ready. If an actual build was happening, you'd see a lot more logs to that effect.
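To illustrate that point, here's a minimal sketch (hypothetical names, not Garden's actual code) of a graph node whose process step short-circuits when the status was already resolved as ready, so the "Processing node" logs appear without any real build happening:

```typescript
// Hypothetical sketch: a build node's process step returns promptly
// when its status was previously resolved as ready.
type Status = { ready: boolean };

interface BuildNode {
  name: string;
  status: Status;
  built: boolean;
}

function processNode(node: BuildNode, log: (msg: string) => void): void {
  log(`Processing node build.${node.name}:process`);
  if (node.status.ready) {
    // Status already resolved as ready: nothing to do, return promptly.
    log(`Build ${node.name} already complete, nothing to do.`);
    return;
  }
  // Only here would an actual build run (and emit many more logs).
  node.built = true;
  log(`Building ${node.name}...`);
}
```

So seeing "Processing node" followed immediately by "Completing node" with no build output is consistent with the ready short-circuit, not with a rebuild.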
The actual issue does seem to be simply that the image isn't available to the cluster, despite the status check indicating that it is. So the first thing I'd like to know is whether the image is there after the test fails.
I made some progress
> I'd like to know whether the image is there after the test fails
It is not. Re-running the same test again creates the image. The log from the CI has the line
Image missing-sh:v-bda89cfaa0 already exists
which comes from here [1]. So the function that's called above returns an identifier, but when I run
docker images missing-sh:v-bda89cfaa0 -q
manually after the first test run, it returns nothing. The call should log this line [2], but if you look at the logs from earlier it's nowhere to be found, so something goes wrong along the way. It is logged on a second attempt, and also when I run locally. I think this is where the problem lies. 1. 2.
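The symptom can be sketched like this (purely illustrative names, not the actual implementation): a status check that trusts a cached identifier will report "already exists" even when a fresh `docker images -q` query would return nothing.

```typescript
// Hypothetical illustration of the symptom: a stale cached image id
// short-circuits the real existence check against the daemon.
const imageIdCache = new Map<string, string>();

// Stand-in for the real daemon query (e.g. `docker images <name> -q`).
function queryDaemon(image: string, daemonImages: Set<string>): string {
  return daemonImages.has(image) ? "bda89cfaa0" : "";
}

function imageExists(image: string, daemonImages: Set<string>): boolean {
  // Bug pattern: a previously cached id makes us skip the real check.
  const cached = imageIdCache.get(image);
  if (cached) {
    return true;
  }
  const id = queryDaemon(image, daemonImages);
  if (id) {
    imageIdCache.set(image, id);
  }
  return id !== "";
}
```

If something (a leftover test fixture, say) poisons the cache, the status resolves as ready while the cluster never actually sees the image.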
Ok nice. That must be the issue then. Think you can debug that further?
I have high hopes I can figure this out today
Figured it out: it was a mock leaking out of its scope (written by me \: ). Here's a green CI [1]. But after rebasing I'm getting timeouts again [2] (I don't yet know what they are, as it's taking too long and I want to go to sleep, but it shouldn't be too hard). 1. 2.
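For anyone hitting the same thing, here's a minimal sketch of that kind of leak (illustrative names, not the actual suite): a monkey-patched function that is never restored still shadows the real one in the next test.

```typescript
// Hypothetical sketch of a mock leaking out of its scope: if the
// original function is not restored, later tests see the fake.
const docker = {
  imageId(name: string): string {
    return ""; // the real call would shell out to `docker images -q`
  },
};

function withMockedImageId<T>(
  fake: () => string,
  fn: () => T,
  restore = true
): T {
  const original = docker.imageId;
  docker.imageId = fake;
  try {
    return fn();
  } finally {
    if (restore) {
      docker.imageId = original; // without this, the mock leaks
    }
  }
}
```

The fix is simply making sure the restore always runs (finally block, or the mocking library's reset hook) so the next test's status check hits the real daemon again.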
Figured out the timeouts as well. They are caused by tests where an action is expected to fail and a certain flag is set for the task processing [1]. Later on, when the action is resolved [2], the solver gets stuck in an infinite loop. It can be reproduced easily with tests like this [3]. @alert-helicopter-61082 can you have a look at this? If this is fixed we'd be really close to a green integ suite. cc @swift-garage-61180 1. 2. 3.
@alert-helicopter-61082 ping. I'd love to get the integ tests passing but I can't quite figure out the solver thing ^
I may be able to look at this tomorrow
But that's as much of a commitment as I can give, unfortunately 😬
@alert-helicopter-61082 if you have time could you take a look. Then I could take a look at the integ tests
Not sure when I will, tbh. The broken test suites now seem to just be kind (the local Kubernetes tool) not working... That looks to be completely separate.
Could we have an introduction session on the solver, then I would feel more comfortable digging in on my own
We can do that, not sure it has anything to do with the solver directly though
It does, turns out
or... hmm. we'll see, this is tricky
Yeah, solver it was. I worked it out.
More issues remain, but the solver issue seems to be solved
That testdouble issue (the mock replacement not working) seems to be a lot of what's wrong in CI. So that's fun.
Very puzzling that this only happens in CI... Trying a simple version upgrade on the lib to start.