I saw the thread where the PS team stated they have no API intentions any time soon. I ended up writing a simple tool to pull the results.json file for a given match, and then wrote another tool to parse it, all with the aim of statistically tracking my progress. This worked up until recently. I’m now getting an XML response back when attempting to pull the JSON, telling me “Access Denied”. Even after updating the CSRF token used in the POST request for the results, I still get “Access Denied”. Has something changed regarding the download permissions for the results file? (It was really convenient having overall and all the stages in a single file, as opposed to having to write another parser and deal with the data for each stage and overall individually using the old HTML results pages.)
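For context, the pull side of my tool is basically just a POST with the CSRF token in a header. The endpoint path, field names, and header name below are placeholders, not the real Practiscore URLs or parameters; this is just a sketch of the approach:

```python
import json
import urllib.request

# Hypothetical endpoint -- a placeholder, not the actual Practiscore URL.
RESULTS_URL = "https://example.com/results/{match_id}/results.json"


def build_results_request(match_id: str, csrf_token: str) -> urllib.request.Request:
    """Build the POST request that pulls results.json for one match."""
    payload = json.dumps({"match": match_id}).encode("utf-8")
    return urllib.request.Request(
        RESULTS_URL.format(match_id=match_id),
        data=payload,
        headers={
            "Content-Type": "application/json",
            "X-CSRF-Token": csrf_token,  # token scraped from the match page
        },
        method="POST",
    )

# Sending it is then just urllib.request.urlopen(req); lately the body that
# comes back is the XML "Access Denied" instead of the JSON.
```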
Right. But I’m not asking about an API. I’m asking about file permissions essentially, or a way to simply acquire the raw data. That doesn’t require an API.
@D_MM
If you want to track your performance, get the Practiscore Competitor App.
It’s what all the high-level shooters who are interested in their performance use.
Oh man…I bought it, and I’m not pleased…I write software for a living, and that app is rooooough at best. The parsing application I wrote provides a much deeper drill-down and gives me way more flexibility in comparison options and outputs. I did, however, see that you can import a raw file; I don’t know the extension because I can’t get back to the original page in the app…it was a .psc file or something? Do you know how I can acquire that export? (To the other dude: clearly there’s some sort of API support if they have their own data type you can import…)
This is an example of one of the outputs. The blue line is me, and the normal distribution was calculated using just class C shooters. I can do this with anyone in the match against any given class, overall, or given division. The distributions are calculated and binned conditionally based on the comparison pool.
I can compare multiple shooters on the same style plots as well with different colors for different people–great for team comparison.
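The math behind those plots is straightforward: fit a normal distribution to the comparison pool (e.g., just the C-class shooters’ match percentages) and see where any given shooter lands on it. A minimal sketch with made-up numbers:

```python
from statistics import NormalDist, mean, stdev

# Hypothetical match percentages for the comparison pool (e.g., C class only).
pool = [42.0, 48.0, 50.0, 55.0, 60.0]
me = 58.0  # my match percentage

# Fit a normal distribution to the pool, then locate me on it.
dist = NormalDist(mu=mean(pool), sigma=stdev(pool))
z = (me - dist.mean) / dist.stdev  # standard score vs. the pool
pct = dist.cdf(me)                 # fraction of the pool I'd sit above

print(f"z = {z:.2f}, percentile = {pct:.0%}")
```

Swapping `pool` for any other slice (division, overall, a different class) gives the conditional binning described above.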
One thing I don’t have is a stacked bar graph for shot type (A, B, C, etc.)…Definitely gonna add that in!
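If anyone wants the same thing, the data prep is just counting hit types per stage; the stacked bars then come straight from those counts. Stage names and hit lists here are invented:

```python
from collections import Counter

# Hypothetical per-stage hit lists pulled from the parsed results.
stage_hits = {
    "Stage 1": ["A", "A", "C", "A", "D"],
    "Stage 2": ["A", "C", "C", "A", "A", "A"],
}

# One Counter per stage; each hit type becomes one segment of a stacked bar.
counts = {stage: Counter(hits) for stage, hits in stage_hits.items()}
for stage, c in counts.items():
    print(stage, dict(c))
```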
Lol that meme is fair. A lot of stats gets used that way sadly.
So, I get a lot out of this in more ways than one. I have B.S. degrees in (astro)physics and mathematics. I like to apply my arts to my hobbies–real world apps–where I can. Shooting and improvement of shooting is perfect for me.
How does this data help me improve? One example: it provided objective evidence that I generally have enough accuracy to sacrifice some of it for speed, ultimately driving up hit factor, provided I don’t rack up penalties. I’ve been doing so, and it’s been giving awesome results. (I sit down after every match, do my statistics, write a subjective review in a notebook, and note what I think I need to work on next match.)
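Concretely, hit factor is just points divided by time, so the tradeoff is easy to quantify. A toy comparison under minor scoring (A = 5 pts, C = 3 pts; the times are invented):

```python
def hit_factor(points: float, time_s: float) -> float:
    """USPSA hit factor: total points divided by raw time."""
    return points / time_s

# 12 hits, minor scoring: A = 5 points, C = 3 points.
careful = hit_factor(points=12 * 5, time_s=10.0)         # all A's, slower
rushed = hit_factor(points=10 * 5 + 2 * 3, time_s=8.5)   # two C's, faster

print(f"careful: {careful:.2f}  rushed: {rushed:.2f}")
```

Dropping a few points but shaving a second and a half comes out ahead, which is exactly the pattern the data showed me.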
Beyond this, with more data points, I can do a multitude of things, such as aggregating data by stage type to answer more specific questions like, “I notice I have lower HFs on short stages: where can I improve there?” or “How can I optimize my performance on long, high-point-count stages?” and have quantitative evidence to back the claims.
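The aggregation itself is simple once each stage is tagged with a type; assuming the tags and hit factors below (all invented):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-stage records: (stage type, hit factor).
stages = [
    ("short", 5.1), ("short", 4.8),
    ("long", 6.4), ("long", 6.9),
    ("medium", 5.9),
]

# Group hit factors by stage type, then average within each group.
by_type = defaultdict(list)
for stage_type, hf in stages:
    by_type[stage_type].append(hf)

avg_hf = {t: mean(hfs) for t, hfs in by_type.items()}
print(avg_hf)
```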
TL;DR It lets me have fun applying my arts and, in doing so, build a highly specific profile of what I can do to improve. It does require a working knowledge of statistics to be useful, but I have that, and it’s simple enough that I feel many could grasp it, given the desire.
You can pretty much tell that without charts or comparisons with other people. Just look at the stage-level accuracy (or the accuracy % for the match).
From a statistics point of view, when comparing with other competitors you have to have some baseline info about those competitors. E.g., the skills of two C-class shooters may be very different (and may differ further from stage to stage), and the whole thing is a moving target depending on who showed up for the match.
In most cases, you yourself are the only benchmark you can rely on.
And as for sacrificing accuracy for speed (or vice versa) to drive HF up, it depends more on the stage’s potential top HF than on the skills of individual shooters. High-HF stages tend to forgive inaccuracy, while low-HF stages are more demanding of accuracy and points.
Neither time nor points alone is the driving factor; they have to be looked at together. A lower HF is not necessarily a bad thing on a given stage - it is just the way the stage is set.
Personally, I find that looking at the individual transition and split times (available from connected shot timers) gives me more insight into individual skill improvement.
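For anyone working that timer data by hand from exported shot timestamps, the draw and the splits are just consecutive differences (timestamps below are invented):

```python
# Hypothetical shot timestamps (seconds from the start signal),
# e.g. exported from a connected shot timer.
shots = [1.10, 1.32, 1.78, 2.01, 3.40]

draw = shots[0]  # time to first shot
splits = [round(b - a, 2) for a, b in zip(shots, shots[1:])]

# Long gaps (here, anything over ~1 s) are usually transitions, movement,
# or reloads rather than splits on the same target.
transitions = [s for s in splits if s > 1.0]
print(f"draw={draw}, splits={splits}, transitions={transitions}")
```

The 1-second cutoff is just an illustrative threshold; in practice you would tune it per stage or tag shots against the stage diagram.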
Of course this can be done without charts and mathematics, but as I said: I’m practicing my art while practicing my hobby. I value quantitative evidence over qualitative all day anyways.
But nevertheless, this is a complete digression from the original question: did the file permissions on results.json change or something? I think it stands to reason that making a single file read-accessible is hardly considered an API…(Again, I’m really just being lazy…I don’t wanna scrape a page per stage and feed it 5-6 URLs when I had it so good in one file.)