It would be great to have a badge for the Google Play store. Ideally it would show the number of downloads, the number of reviews, or the current star rating.
Interesting!
I feel this could be part of social buttons, as well: https://github.com/badges/shields/issues/337
There’s sadly no API for any of that, so we would either have to scrape the web page manually or use an unofficial API like mevvy.
In #649, @espadrine posted:
For Google Play, there's this thing that seems related, which we might be able to use to copy functionality: https://code.google.com/archive/p/android-market-api/.
There still doesn't seem to be any official API to retrieve information about a given app on the Play Store.
The _android-market-api_ project linked above was last updated in early 2012, so it would probably take a lot of work to adapt it to the current marketplace. Nevertheless, I found a few unofficial projects that seem to be more actively maintained, in particular the following repository:
https://github.com/facundoolano/google-play-scraper
It provides the number of downloads, the number of reviews, and the rating of an Android app, which covers the three badges initially requested in this issue. The project seems to be fairly popular, has been under sustained development for around two and a half years, and is open-source.
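For example, here is a minimal sketch of how the module could be queried (the `appId` is just a placeholder, and the field names are taken from the project's README, so they may drift over time):

```js
// Minimal sketch; the appId is a placeholder, and the field names come
// from google-play-scraper's README, so verify them before relying on this.
const gplay = require('google-play-scraper');

gplay
  .app({ appId: 'com.example.app' })
  .then(app => {
    console.log(app.minInstalls); // lower bound of the downloads range
    console.log(app.reviews);     // number of reviews
    console.log(app.score);       // average star rating, e.g. 4.3
  })
  .catch(err => console.error('lookup failed:', err));
```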
Would it be a good idea to use that module in shields in order to create the requested badges? I'm guessing it all boils down to discussions similar to the ones in #990: as with the Chrome Web Store and proposed Opera badges, _google-play-scraper_ parses web pages using cheerio under the hood. Nevertheless, it would probably benefit the shields project to add Android badges, so I would be interested in giving this task a go if the proposed solution is deemed good enough. 👍
@paulmelnikow: any thoughts regarding the above? 😉
I don't have exact numbers, but my impression is that Shields gets by on a tiny hosting budget, relying on optimized code, and avoiding any compute-intensive work. See this conversation on Twitter.
I would love to add features like this, which are relatively computationally expensive. I'd also like to have lower response times, elastic capacity, backup when a hosting provider goes down, and more .9's of reliability. However I don't think we can do both – or even either of those – without increasing our hosting budget and/or finding a provider who will donate hosting. A lot of people like using Shields, so I'm optimistic we could raise funds. But we won't know for sure until we try!
Related: #1250
We need more capacity, or confidence we can buy more capacity, before we add more scrapers.
This makes sense. Nevertheless I feel we're in a rather opaque situation here, as we don't really know what the numbers are. With so many things changing in the project at the moment, we ought to be more aware of our underlying capacity.
I really enjoy doing performance tuning and optimisation, but unfortunately I have the feeling my skills in Javascript are too limited to do anything useful in this area.
Yea… I'm with you on that. More transparency would help a lot.
> I really enjoy doing performance tuning and optimisation, but unfortunately I have the feeling my skills in Javascript are too limited to do anything useful in this area.
A piece of work Thaddée mentioned optimizing is the text width computation.
> There are low-hanging fruits still in text width computation, if we do perf debugging, which would greatly increase max req/s.
>
> — Thaddee Tyl (@espadrine) November 11, 2017
Is that something you might want to work on? I haven't dug in too much. I wonder if it's as simple as pre-computing the width of all the characters and looking them up later. There might be some kerning involved, really not sure. I would imagine this is more about creative problem solving than deep JavaScript.
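Something like this rough sketch is what I'm picturing (every name here is hypothetical, and it sidesteps kerning entirely):

```js
// Hypothetical sketch of the lookup-table idea; measureChar stands in for
// whatever expensive per-character measurement Shields currently performs.
const widthCache = new Map();

function precomputeWidths(measureChar) {
  // Pay the measurement cost once, up front, for all printable ASCII.
  for (let code = 32; code <= 126; code++) {
    const ch = String.fromCharCode(code);
    widthCache.set(ch, measureChar(ch));
  }
}

function textWidth(text, measureChar) {
  let total = 0;
  for (const ch of text) {
    if (!widthCache.has(ch)) {
      widthCache.set(ch, measureChar(ch)); // measure non-ASCII on first sight
    }
    total += widthCache.get(ch);
  }
  return total; // kerning would need a second table keyed on character pairs
}
```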
If not… no worries at all! Your contributions are great, and I'm happy for you to keep finding things that feel in your wheelhouse.
Thanks for your encouraging words, I appreciate them! I will at least take a look at the text width computation and see whether I can come up with some improvements.
> We need more capacity, or confidence we can buy more capacity, before we add more scrapers.
Our OpenCollective is now open, which means we can start to collect $10 donations toward this fund. If you wanted an answer to "what can you do to make this happen," it's to spread that message, far and wide.
Shields is still not implementing new badges that require scraping and parsing websites.
However, we are offering a workaround: the Endpoint badge. You can use it with tools like RunKit and Jupyter Kernel Gateway to implement the exact badge content and logic you want, while letting Shields take care of formatting, rendering, and caching.
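For example, here's a sketch of a tiny endpoint that would serve the Play Store rating requested in this issue (Express and google-play-scraper are illustrative choices on my part; the JSON fields follow the documented endpoint schema):

```js
// Sketch only: Express and google-play-scraper are illustrative choices,
// but the JSON response fields follow the Shields endpoint schema.
const express = require('express');
const gplay = require('google-play-scraper');

const app = express();

app.get('/play-rating/:appId', async (req, res) => {
  try {
    const details = await gplay.app({ appId: req.params.appId });
    res.json({
      schemaVersion: 1,
      label: 'play store',
      message: `${details.score} stars`,
      color: 'green',
    });
  } catch (err) {
    // isError is an optional schema field that styles the badge as an error.
    res.json({ schemaVersion: 1, label: 'play store', message: 'error', isError: true });
  }
});

app.listen(3000);
```

You would then point Shields at it with `https://img.shields.io/endpoint?url=<your endpoint URL>`.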
If you decide to try the new feature, your feedback is much appreciated! Please feel free to comment on the Endpoint beta issue if you have questions about how to use it.