Feature request and explanation: Communities and search

Hey,

I attempted to make this extension myself, but with my limited knowledge of scripting, I unfortunately had problems updating my database and pulling from it to create a search. So I figured I'd post it here in hopes that you guys could implement it and have better luck than I did. I'll explain the layout and how I attempted to script it, to give you a good basis to go off of.

This is meant to better link communities and viewers, and to allow people to find exactly the stream they're looking for regardless of stream size or what game they're playing. We'll first start out with the broadcaster's side of this feature.

Here we have the basic dashboard. I've injected a "Community" button into the div with id "dash_nav" on the dashboard. This keeps the dashboard looking clean and uncluttered when the feature isn't being used.
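For reference, here is a minimal sketch of the kind of injection I mean, assuming the dashboard nav element really does carry id "dash_nav"; the button id and the openCommunityPanel helper are made-up placeholders:

```typescript
// Placeholder for the selection UI described below; the real panel would
// render the five dropdowns and a save button.
function openCommunityPanel(): void {
  console.log("TODO: render community selection panel");
}

// Assumes the dashboard nav element really does have id="dash_nav".
const dashNav = document.getElementById("dash_nav");

if (dashNav) {
  const communityButton = document.createElement("button");
  communityButton.id = "bttv-community-button"; // hypothetical id
  communityButton.textContent = "Community";
  communityButton.addEventListener("click", () => openCommunityPanel());
  dashNav.appendChild(communityButton);
}
```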

If clicked, it opens up the selection portion.

As you can see, there are 5 dropdown menus and a save button. The reason I chose 5 as the number of dropdowns is so that broadcasters can't choose every single community, even ones they don't fit into, just to get as much exposure as possible. By limiting it to 5, they have to choose the ones they actually fit into and that best suit their stream.

Upon clicking one of them, a dropdown appears listing the possible communities to select from. I chose this route instead of allowing people to make their own groups to keep the database clean and make it easy to keep everyone grouped together for easy searches. It also has the added benefit of not allowing people to add any... not so appropriate groups/tags.

Upon selecting the 5 they want, they can click the save button to update the database.

How I was scripting it: I created a broadcaster variable, which pulled the URL and trimmed it down to just the broadcaster's name, plus a variable for each of the choices. As you can see in picture 3, whichever choice was selected updated the text of the button. I used that text as a variable to point to which form in the database should be updated. Each form in the database is just a single list of names, and the database would consist of multiple forms, one for each of the possible communities to select from (a form for speedrunners with a list of names, etc.).
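A rough sketch of that save flow might look like the following, assuming the dashboard URL path starts with the broadcaster's name, that the five choice buttons carry hypothetical ids like "community-choice-1", and that a hypothetical backend endpoint appends a name to a community's list:

```typescript
// Sketch only: the endpoint URL, button ids, and placeholder text are assumptions.
async function saveCommunities(): Promise<void> {
  // Trim the URL down to just the broadcaster name, as described above.
  const broadcaster = window.location.pathname.split("/").filter(Boolean)[0];

  // Read the text of each choice button; that text doubles as the key for
  // which community list ("form") to update.
  const choices: string[] = [];
  for (let i = 1; i <= 5; i++) {
    const button = document.getElementById(`community-choice-${i}`);
    if (button && button.textContent && button.textContent !== "Select...") {
      choices.push(button.textContent.trim());
    }
  }

  // One request per selected community; the endpoint is a made-up example.
  for (const community of choices) {
    await fetch(
      "https://example.com/api/communities/" + encodeURIComponent(community),
      {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ name: broadcaster }),
      }
    );
  }
}
```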

Now let's get to the viewers' side of things. First I injected a button into the left sidebar under Games. As usual, trying to keep it clean looking and professional.

Upon clicking it, it opens a single scrollable list of the communities. These are nothing more than buttons that open a modal.

If a viewer wants to find a new speedrunner, regardless of game or size, they can choose Speedrunner. Did you know Diablo 2 has a good-sized speedrunning community? Click that button and let's see which streamers are speedrunning.

Ok, so this is where I ran into problems, and this may or may not be photoshopped Kappa Also, I have no one speedrunning any games at the moment >.< well, Bob Ross is kinda speedrunning some paintings MiniK But let's just assume this list is populated only with streamers that are speedrunning.

So here's how I attempted (and failed) at scripting this portion: a repeated call function. Because our database forms are only single string names, the community chosen in the second picture would point to which form in our database to pull from, and a function would repeatedly go through Twitch's GET API using each name as a string to populate the modal.
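Roughly, the loop I had in mind looked something like this. Just a sketch: the endpoint shown is the old Kraken streams endpoint, and the exact URL, headers, and Client-ID requirements depend on which API version is in use; addRowToModal is a made-up placeholder for the modal rendering.

```typescript
// For each name stored under a community, ask Twitch whether that channel is
// live, and render the live ones into the modal.
async function populateModal(names: string[]): Promise<void> {
  const live: string[] = [];

  for (const name of names) {
    const response = await fetch(
      `https://api.twitch.tv/kraken/streams/${encodeURIComponent(name)}`,
      { headers: { "Client-ID": "YOUR_CLIENT_ID" } } // placeholder
    );
    const data = await response.json();

    // Kraken returns { stream: null } when the channel is offline.
    if (data.stream) {
      live.push(name);
    }
  }

  live.forEach((name) => addRowToModal(name));
}

// Placeholder for whatever renders an entry in the scrollable modal.
function addRowToModal(name: string): void {
  console.log("would render modal row for", name);
}
```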

Thanks for the read if you made it this far,
Sargasm


The premise behind your idea is quite a good one. Curating channels based on community would generally lead to more organization given how many streams are now live on Twitch. I agree that the current directory from Twitch needs a lot more filters.

The problem with your implementation is not the client UI, but rather how we'd even go about curating the channels. Allow me to explain.

We browse channels that are live in the directory, not offline channels. Twitch’s only API endpoint for gathering streams lets you fetch up to 100 channels at a time. We would need to constantly poll their API for when streams go live. Now, BetterTTV has a massive user base and many people would want to add their channel to this curation list (for example, even with our emotes system there’s thousands of channels uploading and sharing emotes). It’s easy for us to apply filters on our own database to curate content, but when working with Twitch’s API we do not have such liberty. It would be extremely expensive to scale out an API to curate this content, since we would need to be constantly scraping Twitch’s API for thousands of channels and curating these lists.

As of this moment, there are over 20,000 live Twitch channels. Assuming at least that many channels add themselves to a community for this curation, our most effective method of polling is to go through those 20,000 channels using offsets, gathering 100 at a time. That means it takes at least 200 API calls to grab the full list of streams. Given that channels are constantly going live and offline, we'd need to scrape this endpoint frequently to provide an up-to-date list of streamers. And as Twitch grows, the number of requests we need to make grows with it.
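To put rough numbers on it, grabbing a single full snapshot with offset-based paging would look something like this (using the old paginated streams endpoint as an example; parameter names may differ between API versions):

```typescript
// Back-of-the-envelope illustration of the polling cost described above.
async function pollAllLiveStreams(totalLive: number): Promise<unknown[]> {
  const pageSize = 100;                          // hard cap per request
  const pages = Math.ceil(totalLive / pageSize); // 20,000 / 100 = 200 requests
  const streams: unknown[] = [];

  for (let page = 0; page < pages; page++) {
    const response = await fetch(
      `https://api.twitch.tv/kraken/streams?limit=${pageSize}&offset=${page * pageSize}`
    );
    const data = await response.json();
    streams.push(...data.streams);
  }

  return streams; // one full snapshot costs ~200 calls, repeated constantly
}
```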

For these reasons, I don’t really think this is a scalable feature and I do not feel comfortable abusively scraping Twitch’s API. I know there are plenty of other services that scrape Twitch’s API, but I’ve never been a fan of abusing a free resource. Twitch doesn’t have rate limits on their API, even though they really should be rate limiting to prevent this type of abuse. There are already enough stability issues with their API without adding on even further load.

If Twitch ever releases a push-based API (like stream events over websocket), then this would be something to consider, since we could keep a local database of channels that are live in real time and curate without scraping Twitch's API.

Hmm, I was not aware that you could only call 100 at a time. With that, then yeah, it does kinda put a damper on things. Thanks for the reply, night ^.^

~sargasm

What if we were to allow users to manually curate their own directory by adding categories and assigning channels to them? This could be stored locally, so it would depend only on the user's machine and wouldn't even touch the Twitch API. I follow a huge number of channels; I'd love to be able to sort them myself and browse my directory more effectively.

Edit: Drag and Drop
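A local-only version could be as simple as the following sketch, using plain localStorage as a stand-in for whatever storage mechanism the extension actually uses (the key name and functions are made up):

```typescript
// Purely local curation: organizes channels the user already follows,
// without touching the Twitch API.
interface LocalDirectory {
  [category: string]: string[]; // category name -> channel names
}

const STORAGE_KEY = "bttv-local-directory"; // hypothetical key

function loadDirectory(): LocalDirectory {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as LocalDirectory) : {};
}

function addChannelToCategory(category: string, channel: string): void {
  const directory = loadDirectory();
  const channels = directory[category] ?? [];
  if (!channels.includes(channel)) {
    channels.push(channel);
  }
  directory[category] = channels;
  localStorage.setItem(STORAGE_KEY, JSON.stringify(directory));
}

// Example: drag-and-drop a channel onto a "Speedrunners" folder.
addChannelToCategory("Speedrunners", "some_streamer");
```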


I do like that as well. Sorting into folders is something I wish I had on my imgur account xD

As far as calling too many times goes, though, I've come up with 2 separate ideas.

What if, instead of calling on the API, we created our own BTTV "profile"? We choose a profile pic, a 250-character description of our channel, and 5 tags. This would be set as a base point in a form in the database to pull from for the individual group forms. The group forms could be set on a timer to update every 4 hours or so.

Going this route won't give a "live" list of streamers; it would include offline ones as well. We would also have to think of a way to order the results. Streamer profiles get updated once a week, perhaps, and the results get sorted by highest follow count first?
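Roughly, each profile record could look something like this (field names are made up for illustration):

```typescript
// Sketch of the "BTTV profile" record described above.
interface BttvProfile {
  channel: string;        // broadcaster's Twitch name
  profilePicUrl: string;  // chosen profile picture
  description: string;    // up to 250 characters
  tags: string[];         // at most 5 community tags
  followCount: number;    // refreshed on a slow schedule, used for ordering
}

// Ordering a community's results by highest follow count first.
function sortByFollowers(profiles: BttvProfile[]): BttvProfile[] {
  return [...profiles].sort((a, b) => b.followCount - a.followCount);
}
```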

Second possible route:
Since the main problem is calling on the API too many times, we could force a cooldown of sorts. We set up 2 databases (or 1 database with pre and post forms, but for explanation's sake, 2 databases): one pre-API-call and one post-API-call. When the streamer updates their tags, the pre database gets updated, and it would look like the database originally explained in the first post.

Every hour, some number of the database forms (say 6) get called on to be updated. Let's say the speedrun group is one of the 6: the script would take the names from the pre speedrun database, pull the information for them, and store that information in the post speedrun database. When a viewer searches for speedrun, it would pull from the post database instead of going through the API.

Looking at pull numbers: if we do 6 groups an hour with 24 total groups, every group would be updated every 4 hours. This would greatly reduce the number of pulls using the API, with pull sets happening only once every hour.

Furthermore, to reduce it even more, we could add an "is updated" column in our pre database forms. If a user clicks the save button, that database column gets checked. At the time of that group's database update, only the names marked as updated in that form will be pulled. After the update and move to the post database, all the names in the pre forms get marked as "not updated", and at the next pull time in 4 hours a name won't be pulled unless it was updated again.
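Putting the pre/post idea together, the hourly refresh for one group could look roughly like this (all storage shapes and fetch details are assumptions, just to show the flow):

```typescript
// "Pre" side: names plus the updated flag written when a streamer saves tags.
interface PreEntry {
  name: string;
  isUpdated: boolean;
}

// "Post" side: the cached data that viewer searches actually read.
interface PostEntry {
  name: string;
  live: boolean;
  game: string | null;
}

async function refreshGroup(
  preForm: PreEntry[],
  postForm: Map<string, PostEntry>,
  fetchChannel: (name: string) => Promise<PostEntry>
): Promise<void> {
  for (const entry of preForm) {
    if (!entry.isUpdated) continue; // skip names unchanged since the last run

    postForm.set(entry.name, await fetchChannel(entry.name));
    entry.isUpdated = false;        // reset until the streamer saves again
  }
  // Viewer searches read from postForm, never from the API directly.
}
```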

I hope I explained these thoughts clearly enough to give a basis for possible solutions.

thanks again
~sarg
