So I have this script that extracts a user's liked videos and creates a public playlist containing all of them, but because of quota limits I can't add more than 200 videos per day (`playlistItems().insert()` costs 50 quota units out of the 10,000-unit daily quota).
Is there any way I can add videos in batches of 50 or so to avoid running out of quota?

My current code:
```python
youtube = googleapiclient.discovery.build("youtube", "v3", credentials=credentials)

# code for fetching video IDs into a list...
video_ids = [id1, id2, id3, id4, ...]
playlist_id = "<playlist id>"

for video_id in video_ids:
    youtube.playlistItems().insert(
        part="snippet",
        body={
            "snippet": {
                "playlistId": playlist_id,
                "resourceId": {"kind": "youtube#video", "videoId": video_id},
            }
        },
    ).execute()
```

I've heard there's `batch = youtube.new_batch_http_request()`, but it doesn't seem to work for me: it adds only 1 video, and I think it still costs the same amount of quota as the code above.
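For reference, here is a minimal sketch of how batching with `new_batch_http_request()` is usually wired up (assuming `youtube`, `playlist_id`, and `video_ids` already exist as above). Note that, per the YouTube Data API documentation, batching bundles calls into one HTTP round trip but does *not* reduce quota: each inner insert is still billed 50 units. The `chunked` helper and `insert_batch` names are mine, not part of any library.

```python
def chunked(seq, size=50):
    """Split a list into consecutive chunks of at most `size` items."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def insert_batch(youtube, playlist_id, ids):
    """Queue up to 50 playlistItems.insert calls in one batch HTTP request.

    NOTE: this saves HTTP overhead only -- each queued insert still costs
    50 quota units, so batching does NOT stretch the daily quota.
    """
    def on_response(request_id, response, exception):
        # The callback fires once per queued request; without it, failures
        # inside the batch are easy to miss (a likely cause of "only 1
        # video added").
        if exception is not None:
            print(f"Request {request_id} failed: {exception}")

    batch = youtube.new_batch_http_request(callback=on_response)
    for video_id in ids:
        batch.add(youtube.playlistItems().insert(
            part="snippet",
            body={"snippet": {
                "playlistId": playlist_id,
                "resourceId": {"kind": "youtube#video", "videoId": video_id},
            }},
        ))
    batch.execute()  # one HTTP request carrying every queued insert
```

Usage would be `for ids in chunked(video_ids): insert_batch(youtube, playlist_id, ids)`, stopping once the day's quota budget is spent.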
Edit: Found another way to do the above, thanks to Benjamin, but now I'm stuck with this function returning only 991 video IDs even though the liked-videos playlist has 1900 videos.
```python
def get_song_ids(youtube: object):
    video_ids, songs_names, songs_ids = [], [], []
    next_page_token = None
    while True:
        try:
            # Getting the IDs of the user's liked videos.
            liked_videos_response = youtube.videos().list(
                part="id,snippet,contentDetails,status",
                myRating="like",
                maxResults=50,
                pageToken=next_page_token,
            ).execute()

            # Filtering song IDs and extracting other info.
            for item in liked_videos_response["items"]:
                video_ids.append(item["id"])
                title = item["snippet"]["title"]
                category_id = item["snippet"].get("categoryId")
                if category_id == "10":
                    songs_names.append(title)
                    songs_ids.append(item["id"])

            # Checking if there are more videos in the playlist.
            next_page_token = liked_videos_response.get("nextPageToken")
            if not next_page_token:
                break
        except HttpError as error:
            print(error)
            break  # bail out instead of retrying the same page forever

    print("Total:", len(video_ids), len(songs_ids))
    return video_ids, songs_ids
```
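One alternative worth trying, sketched below under the assumption that the special playlist ID `"LL"` resolves to the authenticated user's Liked Videos playlist: paginate `playlistItems().list` (1 quota unit per page) instead of `videos().list(myRating="like")`, since the two endpoints can return different item counts. The `collect_all_pages` helper is hypothetical, written so the pagination logic itself can be tested without network access.

```python
def collect_all_pages(fetch_page):
    """Drain a paginated endpoint. `fetch_page(token)` must return a dict
    with an "items" list and an optional "nextPageToken" string."""
    items, token = [], None
    while True:
        page = fetch_page(token)
        items.extend(page["items"])
        token = page.get("nextPageToken")
        if not token:
            return items

def get_liked_video_ids(youtube):
    """Read liked videos via the 'LL' playlist with playlistItems().list."""
    def fetch_page(token):
        return youtube.playlistItems().list(
            part="contentDetails",
            playlistId="LL",  # assumed: special ID for the Liked Videos playlist
            maxResults=50,
            pageToken=token,
        ).execute()

    return [item["contentDetails"]["videoId"]
            for item in collect_all_pages(fetch_page)]
```

Comparing `len(get_liked_video_ids(youtube))` with the 991 result above would at least show whether the shortfall comes from the endpoint or from the pagination loop.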