During my pre-match preparation I came away worried that this was going to be a defensive slog: Wolves rated as the 5th-best defense, with Arsenal rated 4th. In the end that is how things played out, with Arsenal creating just enough to score.
It was an important win and capped a very good match week for Arsenal's top-four chances.
Wolves 0-1 Arsenal: By the graphics
Wolves 0-1 Arsenal: By the numbers
7 – Shots from Open Play for Arsenal. Arsenal average 10.8 per match this season.
0.5 – Expected Goals (xG) from Open Play for Arsenal. Arsenal average 1.1 open play xG per match this season.
10 – Shots from Open Play for Wolves. Wolves average 7.6 per match this season.
0.6 – Expected Goals (xG) from Open Play for Wolves. Wolves average 0.8 open play xG per match this season.
As I said in the preamble, this ended up being a pretty defensive slog: two good defenses, with an okay attack (Arsenal) against a not-very-good attack (Wolves).
41 – Clearances made by Arsenal, the most Arsenal have made in a match this season. The previous high for Arsenal this season was 33 against Watford.
42 – Crosses attempted by Wolves in this match, 35 of which came from open play.
8 – Passes completed in the box by Wolves.
2 – Passes completed in the box by Wolves that were not crosses.
One of the things that Arsenal did well in this match, especially after the red card, was limiting access to the penalty box. In the end, Arsenal allowed 32 touches in the box, a bit worse than their season average (24 per match), but given that they were playing with 10 men it was a commendable performance.
More important for me was that they forced Wolves into the dreaded crossing strategy. Mikel Arteta anticipated this threat well, bringing on Rob Holding in the 71st minute. Here is how Holding did in those 20-plus minutes.
9 – Clearances, led all players
6 – Headed Clearances, led all players
1 – Interception
1 – Blocked Shot
3 – Aerial Duels won
100% – Aerial Duel success rate
0 – Passes attempted
A perfect defensive performance. He was exactly the right player to stick in the center of defense.
2 – Goals in 2022 for Arsenal.
4 – Red Cards in 2022 for Arsenal.
141 – Minutes Arsenal have played in 2022 with 10 men. Arsenal have played just 568 minutes total in 2022.
25% – Percentage of minutes played in 2022 down a man.
103 – Minutes total that Arsenal have played with a man disadvantage in the Premier League this season, the most of any team.
23 – Minutes total that Arsenal have played with a man advantage in the Premier League this season, the 5th fewest of any team (three teams have 0).
Arsenal have been one of the most punished teams in terms of red cards. I am not one to think there is a conspiracy going on here: looked at in isolation, the vast majority of Arsenal's red cards are justifiable. What does seem to happen, though (and perhaps I notice it because I pay the closest attention to Arsenal), is that Arsenal are rarely given the benefit of the doubt when there is room for subjectivity.
What happened in this match feels like a great example of this. I think Gabriel Martinelli committed two fouls worthy of yellow cards: he tried to slow down a quick throw-in, then, frustrated that he failed, took down the player in a way that was much more obvious than it needed to be. I think Michael Oliver generally refereed a good match, but he got caught up in the moment in this situation.
What makes this one especially weird is that you pretty much never see two yellows given in the same sequence of play; it has happened before, but it is very rare. It is these types of decisions that seem to happen to Arsenal more often (though it is possible it feels that way because Arsenal are my frame of reference): things that are within the laws of the game but are not generally enforced. I don't know what to make of it. Martinelli earned the red card, but it still feels unfair because you never see that happen.
The Lacazette Chance
12% – The probability of a goal being scored by my expected goals model
7% – The probability of a goal being scored by the Understat expected goals model
26% – The probability of a goal being scored by the Wyscout expected goals model
35% – The probability of a goal being scored by StatsBomb’s expected goals model
I don't love looking at individual chances and going by just what the expected goals models say. Trying to measure the "true" probability of a single chance is really hard, and factors that aren't measured can have a big effect on a single chance. It is important to treat the value reported for the quality of a chance as an average estimate with a good amount of uncertainty, where the real probability can be better or worse.
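As a rough illustration of that spread, here is a quick summary of the four published estimates for this chance (this is just simple averaging of the numbers above, not how any of these models work internally):

```python
# The four published xG values for the Lacazette chance,
# treated here as independent estimates of the same quantity.
estimates = {
    "my model": 0.12,
    "Understat": 0.07,
    "Wyscout": 0.26,
    "StatsBomb": 0.35,
}

values = list(estimates.values())
mean = sum(values) / len(values)
spread = max(values) - min(values)

print(f"mean estimate: {mean:.2f}")   # the simple average is 0.20
print(f"range: {min(values):.2f} to {max(values):.2f} (spread {spread:.2f})")
```

Four models that all claim to measure the same thing land anywhere from 7% to 35%, which is exactly the kind of uncertainty a single point estimate hides.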
The other thing that adds uncertainty is how accurate we are at measuring where things happen. The three images below will show the same shot but all from slightly different locations on the pitch.
One of the biggest factors driving expected goals is where a shot takes place. With differences in recorded locations this large, the estimates will differ based simply on where the person coding the match thinks the shot happened.
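To show how sensitive this is, here is a toy logistic xG sketch driven only by shot distance and angle. To be clear, the coefficients are invented for illustration and this is not any of the models cited here; the point is just that moving the recorded location by a metre or two visibly moves the estimate:

```python
import math

def toy_xg(x, y):
    """Hypothetical logistic xG model (illustration only).

    x: metres out from the goal line; y: metres off-centre.
    Coefficients are made up, not fitted to any real data.
    """
    distance = math.hypot(x, y)
    # Angle subtended by the 7.32 m goal mouth at the shot location.
    angle = math.atan2(7.32 * x, x ** 2 + y ** 2 - (7.32 / 2) ** 2)
    z = -1.2 - 0.12 * distance + 1.8 * angle  # invented coefficients
    return 1 / (1 + math.exp(-z))

# The "same" shot coded at three slightly different spots:
for x, y in [(10.0, 3.0), (11.5, 3.5), (9.0, 2.0)]:
    print(f"coded at ({x:4.1f}, {y:4.1f}) -> xG {toy_xg(x, y):.2f}")
```

Even with a fixed model, a metre or two of coding difference shifts the estimate meaningfully, and the real providers also differ in their models, so the gaps compound.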
What I think we should take away from this is twofold: first, Lacazette's chance wasn't easy, and realistically it was probably in the 1 in 4 to 1 in 3 range for being scored; second, those of us who work with stats should do a better job of communicating the uncertainty in our ratings (this is why I have added the shaded areas to my running xG charts).
Sources: Opta via WhoScored, StatsZone, Understat, and my own database; StatsBomb via FBref; Wyscout.