Compare commits

...

56 Commits

Author SHA1 Message Date
Benex254
1a72f88be3 docs: updaate readme 2024-08-22 18:31:23 +03:00
Benex254
1a9f1120b8 chore: bump version 2024-08-22 18:31:11 +03:00
Benex254
c2fc807688 feat: episode preview 2024-08-22 18:25:41 +03:00
Benex254
2b0ade093c feat: normalize anime titles 2024-08-22 17:32:53 +03:00
BeneX254
a26193706e Update README.md 2024-08-22 13:37:18 +03:00
BeneX254
ff3c57ef9b Update README.md 2024-08-22 13:31:53 +03:00
BeneX254
3b987bd07a Update README.md 2024-08-22 12:43:58 +03:00
BeneX254
e8474c0428 Update README.md 2024-08-22 12:37:43 +03:00
BeneX254
c78a759aa1 Update README.md 2024-08-22 00:38:46 +03:00
Benex254
d1aad70c48 feat: add awesome completions to search command 2024-08-21 23:49:39 +03:00
Benex254
62b36f3e58 fix: workaround over typing issue 2024-08-21 23:20:45 +03:00
Benex254
c5b905fb0d chore: update deps 2024-08-21 23:18:12 +03:00
Benex254
7d3dc671ed fix: workaround typing issue 2024-08-21 23:07:01 +03:00
Benex254
0ec3c7a5bb docs: update docs 2024-08-21 22:53:30 +03:00
Benex254
8e0619863a feat: search command 2024-08-21 22:53:18 +03:00
Benex254
e8a05ec4b8 feat: add dump json to anilist commands 2024-08-21 20:48:01 +03:00
Benex254
34e8b2abd1 feat: update download command 2024-08-21 19:45:57 +03:00
Benex254
161b6eb961 chore: bump version 2024-08-21 19:41:35 +03:00
Benex254
dd2090f85d docs: update 2024-08-21 19:41:01 +03:00
Benex254
8b1595a5da feat:update 2024-08-21 19:40:45 +03:00
Benex254
77ffa27ed8 chore: bump version 2024-08-21 17:37:09 +03:00
Benex254
15f79b65c9 feat: aniwave?? 2024-08-21 17:18:30 +03:00
Benex254
33c3af0241 chore: remove print and input statements 2024-08-21 16:00:52 +03:00
Benex254
9badde62fb feat: improve providers 2024-08-21 15:58:01 +03:00
Benex254
4e401dca40 fix: logging issue 2024-08-21 14:53:30 +03:00
Benex254
25422b1b7d feat: improve aniwatch provider api 2024-08-21 14:52:56 +03:00
Benex254
e8463f13b4 chore: reconfigure pyright 2024-08-21 11:42:48 +03:00
Benex254
556f42e41f fix: clean option of download command 2024-08-21 11:41:55 +03:00
Benex254
b99a4f7efc chore: bump version 2024-08-19 23:44:05 +03:00
Benex254
f6f45cf322 docs: update readme 2024-08-19 23:43:50 +03:00
Benex254
ae6db1847a feat: improve download functionality 2024-08-19 23:43:34 +03:00
Benex254
20d04ea07b feat(utils): add m3u8 quality selector 2024-08-19 17:27:52 +03:00
Benex254
8f3834453c chore: bump version 2024-08-19 15:28:04 +03:00
Benex254
7ad8b8a0e3 fix: return values 2024-08-19 15:25:36 +03:00
Benex254
80b41f06da feat:add new ui command 2024-08-19 15:25:05 +03:00
Benex254
e79321ed50 chore: bump version 2024-08-19 13:05:03 +03:00
Benex254
f7b5898dfa fix: some stuff 2024-08-19 13:04:30 +03:00
Benex254
144bf53081 chore: bump version 2024-08-19 11:01:13 +03:00
Benex254
16dded9724 fix: inability to properly detect terminal 2024-08-19 10:51:39 +03:00
Benex254
c47b158bff fix: logging issue 2024-08-19 10:51:11 +03:00
Benex254
9a36e15d9d feat: intergrate subs to python-mpv based player 2024-08-19 10:37:04 +03:00
Benex254
d6b2bd7761 fix: ep title 2024-08-19 10:36:20 +03:00
Benex254
2346552dc4 fix: logging issue 2024-08-19 00:38:51 +03:00
Benex254
ba275055db fix: logging issue 2024-08-19 00:38:29 +03:00
Benex254
de4ddf2f3a chore: bump version 2024-08-19 00:21:48 +03:00
Benex254
9c94d824d1 fix: rearrange servers available 2024-08-19 00:21:16 +03:00
Benex254
495f3cfbf6 chore: bump version 2024-08-18 23:59:30 +03:00
Benex254
b56c9ae3dd docs: update reamde 2024-08-18 23:59:16 +03:00
Benex254
5e9ef87526 feat: improve provider api 2024-08-18 23:55:29 +03:00
Benex254
b68d6d6fe9 feat: accomodate subtitle streams 2024-08-18 23:54:59 +03:00
Benex254
5870cc6640 feat: accomodate subtitle streams 2024-08-18 23:54:36 +03:00
Benex254
7a43d58d82 fix: command order 2024-08-18 23:54:16 +03:00
Benex254
fc7efebc8d feat: accomodate subtitle streams 2024-08-18 23:53:36 +03:00
Benex254
528be74194 feat(aniwatch): init 2024-08-18 23:52:18 +03:00
Benex254
ab782acf2f chore: bump version 2024-08-18 15:47:44 +03:00
Benex254
45836d1ebc fix: handle no matches for search results 2024-08-18 15:47:29 +03:00
59 changed files with 2568 additions and 542 deletions

105
README.md
View File

@@ -2,11 +2,14 @@
 Welcome to **FastAnime**, anime site experience from the terminal.
-**fzf mode**
+![fastanime-demo](https://github.com/user-attachments/assets/16e29f54-e9fa-48c7-b944-bfacb31ae1b5)
+<details>
+<summary><b>fzf mode</b></summary>
 [fa_fzf_demo.webm](https://github.com/user-attachments/assets/b1fecf25-e358-4e8b-a144-bcb7947210cf)
-**other modes:**
+</details>
 <details>
 <summary><b>rofi mode</b></summary>
@@ -51,7 +54,6 @@ Heavily inspired by [animdl](https://github.com/justfoolingaround/animdl), [magi
 - [Key Bindings](#key-bindings)
 - [Script Messages](#script-messages)
 - [Configuration](#configuration)
-- [The python api](#the-python-api)
 - [Contributing](#contributing)
 - [Receiving Support](#receiving-support)
 - [Supporting the Project](#supporting-the-project)
@@ -59,7 +61,7 @@ Heavily inspired by [animdl](https://github.com/justfoolingaround/animdl), [magi
 > [!IMPORTANT]
 >
-> This project currently scrapes allanime and animepahe. The site is in the public domain and can be accessed by any one with a browser.
+> This project currently scrapes allanime, aniwatch and animepahe. The site is in the public domain and can be accessed by any one with a browser.
 ## Installation
@@ -179,6 +181,7 @@ The only required external dependency, unless you won't be streaming, is [MPV](h
 ## Usage
 The project offers a featureful command-line interface and MPV interface through the use of python-mpv.
+The project also offers subs in different languages thanks to aniwatch provider.
 ### The Commandline interface :fire:
@@ -221,7 +224,7 @@ Available options for the fastanime include:
 - `--default` use the default ui
 - `--preview` show a preview when using fzf
 - `--no-preview` dont show a preview when using fzf
-- `--format <yt-dlp format string>` or `-f <yt-dlp format string>` set the format of anime downloaded and streamed based on yt-dlp format. Works when `--server gogoanime`
+- `--format <yt-dlp format string>` or `-f <yt-dlp format string>` set the format of anime downloaded and streamed based on [yt-dlp format](https://github.com/yt-dlp/yt-dlp#format-selection). Works when `--server gogoanime` or on providers that provide multi quality streams eg aniwatch
 - `--icons/--no-icons` toggle the visibility of the icons
 - `--skip/--no-skip` whether to skip the opening and ending theme songs.
 - `--rofi` use rofi for the ui
@@ -234,6 +237,8 @@ Available options for the fastanime include:
 - `--use-mpv-mod/--use-default-player` whether to use python-mpv
 - `--provider <allanime/animepahe>` anime site of choice to scrape from
 - `--sync-play` or `-sp` use syncplay for streaming anime so you can watch with your friends
+- `--sub-lang <en/or any other common shortform for country>` regex is used to determine the appropriate. Only works when provider is aniwatch.
+- `--normalize-titles/--no-normalize-titles` whether to normalize provider titles
 Example usage of the above options
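As an editorial aside on the `--format` option added above: it is forwarded to yt-dlp's format selector. The snippet below is a rough, standalone sketch of what such a format string does, independent of FastAnime's own wiring; the URL is a placeholder.

```python
# Minimal yt-dlp format-selection sketch; not FastAnime's actual plumbing.
# The URL is a placeholder used purely for illustration.
import yt_dlp

ydl_opts = {
    # prefer a stream no taller than 720p, fall back to the best available
    "format": "best[height<=720]/best",
    "outtmpl": "%(title)s.%(ext)s",
}

with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    ydl.download(["https://example.com/episode.m3u8"])
```

A selector like `best[height<=720]/best` only makes a difference on servers that expose several qualities, which is why the option is documented as working with `--server gogoanime` or multi-quality providers such as aniwatch.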
@@ -267,6 +272,7 @@ Run `fastanime anilist` to access the main interface.
 ##### Subcommands
 The subcommands are mainly their as convenience. Since all the features already exist in the main interface.
+Most of the subcommands share the common option `--dump-json` or `-d` which will print only the json data and suppress the ui.
 - `fastanime anilist trending`: Top 15 trending anime.
 - `fastanime anilist recent`: Top 15 recently updated anime.
@@ -276,6 +282,46 @@ The subcommands are mainly their as convenience. Since all the features already
 - `fastanime anilist favourites`: Top 15 favorite anime.
 - `fastanime anilist random`: get random anime
+**FastAnime Anilist Search subcommand** 🔥 🔥 🔥
+It is by far one of the most powerful commands.
+It offers the following options:
+- `--sort <MediaSort>` or `-s <MediaSort>`
+- `--title <anime-title>` or `-t <anime-title>`
+- `--tags <tag>` or `-T <tag>` can be specified multiple times for different tags to filter by.
+- `--year <year>` or `-y <year>`
+- `--status <MediaStatus>` or `-S <MediaStatus>`
+- `--media-format <MediaFormat>` or `-f <MediaFormat>`
+- `--season <MediaSeason>`
+- `--genres <genre>` or `-g <genre>` can be specified multiple times.
+Example:
+```bash
+# get anime with the tag of isekai
+fastanime anilist search -T isekai
+# get anime of 2024 and sort by popularity
+fastanime anilist search -y 2024 -s POPULARITY_DESC
+# get anime of 2024 season WINTER
+fastanime anilist search -y 2024 --season WINTER
+# get anime genre action and tag isekai,magic
+fastanime anilist search -g Action -T Isekai -T Magic
+# get anime of 2024 thats finished airing
+fastanime anilist search -y 2024 -S FINISHED
+# get the most favourite anime movies
+fastanime anilist search -f MOVIE -s FAVOURITES_DESC
+```
+For more details visit the anilist docs or just get the completions which will improve the experience.
+Like seriously **[get the completions](https://github.com/Benex254/FastAnime#completions-subcommand)** and the experience will be a 💯 💯 better.
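Since the new `--dump-json` flag prints raw JSON and suppresses the UI, the output can be consumed by other programs. Below is a hedged sketch of piping it into Python with `subprocess`; it assumes the dumped document mirrors AniList's GraphQL response shape (`data -> Page -> media`), which may differ from what the command actually emits.

```python
# Sketch: consume `fastanime anilist search --dump-json` from another script.
# Assumes the dumped JSON follows AniList's data -> Page -> media layout;
# adjust the key lookups if the real shape differs.
import json
import subprocess

result = subprocess.run(
    ["fastanime", "anilist", "search", "-y", "2024", "-g", "Action", "--dump-json"],
    capture_output=True,
    text=True,
    check=True,
)
payload = json.loads(result.stdout)
for media in payload.get("data", {}).get("Page", {}).get("media", []):
    title = media.get("title", {})
    print(title.get("romaji") or title.get("english"))
```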
 The following are commands you can only run if you are signed in to your AniList account:
 - `fastanime anilist watching`
@@ -285,7 +331,7 @@ The following are commands you can only run if you are signed in to your AniList
 - `fastanime anilist paused`
 - `fastanime anilist completed`
-Plus: `fastanime anilist notifier` :fire:
+Plus: `fastanime anilist notifier` 🔥
 ```bash
 # basic form
@@ -362,6 +408,24 @@ fastanime download -t <anime-title> -r ':<episodes-end>'
 # remember python indexing starts at 0
 fastanime download -t <anime-title> -r '<episode-1>:<episode>'
+# merge subtitles with ffmpeg to mkv format; aniwatch tends to give subs as separate files
+# and dont prompt for anything
+# eg existing file in destination instead remove
+# and clean
+# ie remove original files (sub file and vid file)
+# only keep merged files
+fastanime download -t <anime-title> --merge --clean --no-prompt
+# EOF is used since -t always expects a title
+# you can supply anime titles from file or -t at the same time
+#
+# from stdin
+echo -e "<anime-title>\n<anime-title>\n<anime-title>" | fastanime download -t "EOF" -r <range> -f -
+# from file
+fastanime download -t "EOF" -r <range> -f <file-path>
 ```
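The `EOF` sentinel and the `-f -`/`-f <file-path>` behaviour shown above are implemented in the download command (its diff appears further down this page). The following is a simplified, editorial sketch of that title-collection step, assuming newline-separated titles; the function name is illustrative, not the project's actual helper.

```python
# Sketch of combining titles read from a file (or stdin via `-f -`) with titles
# passed on the command line, using "EOF" as a stop sentinel, mirroring the
# download command diff below. Names are illustrative.
import sys


def collect_titles(cli_titles: tuple[str, ...], file_obj=None) -> list[str]:
    titles: list[str] = []
    if file_obj is not None:
        contents = file_obj.read()
        titles.extend(line for line in contents.split("\n") if line)
    # titles from the file are queued first, then the ones given with -t
    titles.extend(cli_titles)

    queued: list[str] = []
    for title in titles:
        if title == "EOF":  # sentinel: stop queueing further titles
            break
        queued.append(title)
    return queued


if __name__ == "__main__":
    # e.g. echo -e "title one\ntitle two" | python this_sketch.py
    print(collect_titles(("EOF",), sys.stdin))
```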
#### search subcommand #### search subcommand
@@ -579,6 +643,10 @@ script-message select-server <server-name>
 script-message select-quality <1080/720/480/360>
 ```
+## styling the default interface
+The default interface uses inquirerPy which is customizable. Read here to findout more <https://inquirerpy.readthedocs.io/en/latest/pages/env.html>
 ## Configuration
 The app includes sensible defaults but can be customized extensively. Configuration is stored in `.ini` format at `~/.config/FastAnime/config.ini` on arch linux; for the other operating systems you can check by running `fastanime config --path`.
@@ -612,6 +680,7 @@ skip=false
 # used in the continue from time stamp
 error=3
+# whether to use python mpv for enhanced experience
 use_mpv_mod=False
 # the format of downloaded anime and trailer
@@ -628,6 +697,8 @@ provider = allanime
 preferred_language = romaji # Display language (options: english, romaji)
+normalize_titles = true
 downloads_dir = <Default-videos-dir>/FastAnime # Download directory
 preview=false # whether to show a preview window when using fzf or rofi
@@ -654,28 +725,6 @@ notification_duration=2
 # Not implemented yet
 ```
-## The python api
-The project offers a python api that can be used in other python programs.
-```python
-from fastanime.AnimeProvider import AnimeProvider
-# all output is typed, so will be easy to work with
-# providers include [allanime, animepahe]
-provider = AnimeProvider(provider="allanime")
-# to search for anime
-provider.search_for_anime()
-# to get anime info
-provider.get_anime()
-# to get streams of an episode
-provider.get_episode_streams()
-```
 ## Contributing
 We welcome your issues and feature requests. However, due to time constraints, we currently do not plan to add another provider.

View File

@@ -37,12 +37,12 @@ class AnimeProvider:
         self.provider = provider
         self.dynamic = dynamic
         self.retries = retries
-        self.lazyload_provider()
+        self.lazyload_provider(self.provider)

-    def lazyload_provider(self):
+    def lazyload_provider(self, provider):
         """updates the current provider being used"""
-        _, anime_provider_cls_name = anime_sources[self.provider].split(".", 1)
-        package = f"fastanime.libs.anime_provider.{self.provider}"
+        _, anime_provider_cls_name = anime_sources[provider].split(".", 1)
+        package = f"fastanime.libs.anime_provider.{provider}"
         provider_api = importlib.import_module(".api", package)
         anime_provider = getattr(provider_api, anime_provider_cls_name)
         self.anime_provider = anime_provider()
@@ -73,7 +73,7 @@ class AnimeProvider:
                 user_query, translation_type, nsfw, unknown
             )
         except Exception as e:
-            logging.error(e)
+            logger.error(e)
             results = None
         return results
@@ -95,7 +95,7 @@ class AnimeProvider:
         try:
             results = anime_provider.get_anime(anime_id)
         except Exception as e:
-            logging.error(e)
+            logger.error(e)
             results = None
         return results
@@ -123,6 +123,6 @@ class AnimeProvider:
                 anime, episode, translation_type
             )
         except Exception as e:
-            logging.error(e)
+            logger.error(e)
             results = None
         return results  # pyright:ignore
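The change above threads the provider name through `lazyload_provider` instead of re-reading `self.provider`, while keeping the lazy `importlib` import of the selected backend. For readers unfamiliar with the pattern, here is a small self-contained sketch of lazy-loading a class by dotted path; the registry below uses stdlib classes purely for illustration, whereas the real mapping lives in `anime_sources`.

```python
# Standalone sketch of the lazy-import pattern used by AnimeProvider.
# The registry entries here are stdlib classes chosen only so the sketch runs;
# FastAnime resolves provider classes from anime_sources instead.
import importlib

registry = {
    "json": "json.JSONDecoder",
    "csv": "csv.DictReader",
}


def lazyload(name: str):
    """Import the backing module only when the backend is first requested."""
    module_name, cls_name = registry[name].rsplit(".", 1)
    module = importlib.import_module(module_name)
    return getattr(module, cls_name)


print(lazyload("json"))  # <class 'json.decoder.JSONDecoder'>
```

Deferring the import this way keeps CLI start-up cheap: only the provider the user actually selected gets imported.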

View File

@@ -9,6 +9,3 @@ anime_normalizer = {
     "Dungeon ni Deai o Motomeru no wa Machigatte Iru Darouka": "Dungeon ni Deai wo Motomeru no wa Machigatteiru Darou ka",
     'Hazurewaku no "Joutai Ijou Skill" de Saikyou ni Natta Ore ga Subete wo Juurin suru made': "Hazure Waku no [Joutai Ijou Skill] de Saikyou ni Natta Ore ga Subete wo Juurin Suru made",
 }
-
-anilist_sort_normalizer = {"search match": "SEARCH_MATCH"}
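`anime_normalizer` maps alternate provider spellings to a canonical title, and this compare adds a `--normalize-titles` toggle that relies on it. The sketch below is a guess at how such a lookup could be applied case-insensitively with a pass-through fallback; `normalize_title` is an invented helper name, not necessarily the project's actual function.

```python
# Sketch of applying the anime_normalizer mapping; normalize_title is a
# hypothetical helper used only to illustrate the lookup-with-fallback idea.
anime_normalizer = {
    "Dungeon ni Deai o Motomeru no wa Machigatte Iru Darouka": (
        "Dungeon ni Deai wo Motomeru no wa Machigatteiru Darou ka"
    ),
}


def normalize_title(title: str) -> str:
    """Return the canonical spelling if one is known, else the title unchanged."""
    for provider_title, canonical in anime_normalizer.items():
        if provider_title.lower() == title.lower():
            return canonical
    return title


print(normalize_title("dungeon ni deai o motomeru no wa machigatte iru darouka"))
```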

View File

@@ -1,8 +1,14 @@
 import logging
+import os
+import shutil
+import subprocess
+import tempfile
 from queue import Queue
 from threading import Thread
 import yt_dlp
+from rich import print
+from rich.prompt import Confirm
 from yt_dlp.utils import sanitize_filename

 logger = logging.getLogger(__name__)
@@ -38,6 +44,10 @@ class YtDLPDownloader:
         force_unknown_ext=False,
         verbose=False,
         headers={},
+        sub="",
+        merge=False,
+        clean=False,
+        prompt=True,
     ):
         """Helper function that downloads anime given url and path details
@@ -60,9 +70,85 @@
             "format": vid_format,
             "compat_opts": ("allow-unsafe-ext",) if force_unknown_ext else tuple(),
         }
-        with yt_dlp.YoutubeDL(ydl_opts) as ydl:
-            ydl.download([url])
+        urls = [url]
+        if sub:
+            urls.append(sub)
+        vid_path = ""
+        sub_path = ""
+        for i, url in enumerate(urls):
+            with yt_dlp.YoutubeDL(ydl_opts) as ydl:
+                info = ydl.extract_info(url, download=True)
+                if not info:
+                    continue
+                if i == 0:
+                    vid_path = info["requested_downloads"][0]["filepath"]
+                else:
+                    sub_path = info["requested_downloads"][0]["filepath"]
+        if sub_path and vid_path and merge:
+            self.merge_subtitles(vid_path, sub_path, clean, prompt)
+
+    def merge_subtitles(self, video_path, sub_path, clean, prompt):
+        # Extract the directory and filename
+        video_dir = os.path.dirname(video_path)
+        video_name = os.path.basename(video_path)
+        video_name, _ = os.path.splitext(video_name)
+        video_name += ".mkv"
+        FFMPEG_EXECUTABLE = shutil.which("ffmpeg")
+        if not FFMPEG_EXECUTABLE:
+            print("[yellow bold]WARNING: [/]FFmpeg not found")
+            return
+        # Create a temporary directory
+        with tempfile.TemporaryDirectory() as temp_dir:
+            # Temporary output path in the temporary directory
+            temp_output_path = os.path.join(temp_dir, video_name)
+            # FFmpeg command to merge subtitles
+            command = [
+                FFMPEG_EXECUTABLE,
+                "-hide_banner",
+                "-i",
+                video_path,
+                "-i",
+                sub_path,
+                "-c",
+                "copy",
+                "-map",
+                "0",
+                "-map",
+                "1",
+                temp_output_path,
+            ]
+            # Run the command
+            try:
+                subprocess.run(command, check=True)
+                # Move the file back to the original directory with the original name
+                final_output_path = os.path.join(video_dir, video_name)
+                if os.path.exists(final_output_path):
+                    if not prompt or Confirm.ask(
+                        f"File exists({final_output_path}) would you like to overwrite it",
+                        default=True,
+                    ):
+                        # move file to dest
+                        os.remove(final_output_path)
+                        shutil.move(temp_output_path, final_output_path)
+                else:
+                    shutil.move(temp_output_path, final_output_path)
+                # clean up
+                if clean:
+                    print("[cyan]Cleaning original files...[/]")
+                    os.remove(video_path)
+                    os.remove(sub_path)
+                print(
+                    f"[green bold]Subtitles merged successfully.[/] Output file: {final_output_path}"
+                )
+            except subprocess.CalledProcessError as e:
+                print(f"[red bold]Error[/] during merging subtitles: {e}")
+            except Exception as e:
+                print(f"[red bold]An error[/] occurred: {e}")
     # WARN: May remove this legacy functionality
     def download_file(self, url: str, title, silent=True):
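A design note on the hunk above: the rewritten download path calls `extract_info(..., download=True)` instead of `download()` so it can learn where each file landed (`requested_downloads[0]["filepath"]`) and later hand the video and subtitle paths to ffmpeg. Below is a reduced, standalone sketch of just that piece; the URLs are placeholders and the `requested_downloads` key assumes a reasonably recent yt-dlp.

```python
# Sketch: download with yt-dlp and recover the output paths, the prerequisite
# for muxing a separately downloaded subtitle file into an .mkv afterwards.
# The URLs are placeholders; "requested_downloads" is populated by recent yt-dlp.
import yt_dlp

ydl_opts = {"outtmpl": "downloads/%(title)s.%(ext)s", "quiet": True}

paths = []
for url in ["https://example.com/episode.m3u8", "https://example.com/episode.en.vtt"]:
    with yt_dlp.YoutubeDL(ydl_opts) as ydl:
        info = ydl.extract_info(url, download=True)
        if info:
            paths.append(info["requested_downloads"][0]["filepath"])

video_path, subtitle_path = paths  # ready to be merged with ffmpeg as above
```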

View File

@@ -6,7 +6,7 @@ if sys.version_info < (3, 10):
     )  # noqa: F541

-__version__ = "v2.2.5"
+__version__ = "v2.4.0"
 APP_NAME = "FastAnime"
 AUTHOR = "Benex254"

View File

@@ -4,7 +4,6 @@ import click
from .. import __version__ from .. import __version__
from ..libs.anime_provider import SERVERS_AVAILABLE, anime_sources from ..libs.anime_provider import SERVERS_AVAILABLE, anime_sources
from ..Utility.data import anilist_sort_normalizer
from .commands import LazyGroup from .commands import LazyGroup
commands = { commands = {
@@ -98,6 +97,11 @@ signal.signal(signal.SIGINT, handle_exit)
type=click.Choice(["dub", "sub"]), type=click.Choice(["dub", "sub"]),
help="Anime language[dub/sub]", help="Anime language[dub/sub]",
) )
@click.option(
"-sl",
"--sub-lang",
help="Set the preferred language for subs",
)
@click.option( @click.option(
"-A/-no-A", "-A/-no-A",
"--auto-next/--no-auto-next", "--auto-next/--no-auto-next",
@@ -111,9 +115,9 @@ signal.signal(signal.SIGINT, handle_exit)
help="Auto select anime title?", help="Auto select anime title?",
) )
@click.option( @click.option(
"-S", "--normalize-titles/--no-normalize-titles",
"--sort-by", type=bool,
type=click.Choice(anilist_sort_normalizer.keys()), # pyright: ignore help="whether to normalize anime and episode titls given by providers",
) )
@click.option("-d", "--downloads-dir", type=click.Path(), help="Downloads location") @click.option("-d", "--downloads-dir", type=click.Path(), help="Downloads location")
@click.option("--fzf", is_flag=True, help="Use fzf for the ui") @click.option("--fzf", is_flag=True, help="Use fzf for the ui")
@@ -156,10 +160,11 @@ def run_cli(
local_history, local_history,
skip, skip,
translation_type, translation_type,
sub_lang,
quality, quality,
auto_next, auto_next,
auto_select, auto_select,
sort_by, normalize_titles,
downloads_dir, downloads_dir,
fzf, fzf,
default, default,
@@ -186,7 +191,7 @@ def run_cli(
FORMAT = "%(message)s" FORMAT = "%(message)s"
logging.basicConfig( logging.basicConfig(
level="NOTSET", format=FORMAT, datefmt="[%X]", handlers=[RichHandler()] level=logging.DEBUG, format=FORMAT, datefmt="[%X]", handlers=[RichHandler()]
) )
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
logger.info("logging has been initialized") logger.info("logging has been initialized")
@@ -203,6 +208,10 @@ def run_cli(
datefmt="[%d/%m/%Y@%H:%M:%S]", datefmt="[%d/%m/%Y@%H:%M:%S]",
filemode="w", filemode="w",
) )
else:
import logging
logging.basicConfig(level=logging.CRITICAL)
if rich_traceback: if rich_traceback:
from rich.traceback import install from rich.traceback import install
@@ -216,10 +225,17 @@ def run_cli(
ctx.obj.server = server ctx.obj.server = server
if format: if format:
ctx.obj.format = format ctx.obj.format = format
if sub_lang:
ctx.obj.sub_lang = sub_lang
if ctx.get_parameter_source("continue_") == click.core.ParameterSource.COMMANDLINE: if ctx.get_parameter_source("continue_") == click.core.ParameterSource.COMMANDLINE:
ctx.obj.continue_from_history = continue_ ctx.obj.continue_from_history = continue_
if ctx.get_parameter_source("skip") == click.core.ParameterSource.COMMANDLINE: if ctx.get_parameter_source("skip") == click.core.ParameterSource.COMMANDLINE:
ctx.obj.skip = skip ctx.obj.skip = skip
if (
ctx.get_parameter_source("normalize_titles")
== click.core.ParameterSource.COMMANDLINE
):
ctx.obj.normalize_titles = normalize_titles
if quality: if quality:
ctx.obj.quality = quality ctx.obj.quality = quality
@@ -242,8 +258,6 @@ def run_cli(
== click.core.ParameterSource.COMMANDLINE == click.core.ParameterSource.COMMANDLINE
): ):
ctx.obj.use_mpv_mod = use_mpv_mod ctx.obj.use_mpv_mod = use_mpv_mod
if sort_by:
ctx.obj.sort_by = sort_by
if downloads_dir: if downloads_dir:
ctx.obj.downloads_dir = downloads_dir ctx.obj.downloads_dir = downloads_dir
if translation_type: if translation_type:
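The CLI diff above repeatedly checks `ctx.get_parameter_source(...)` so that boolean flags such as `--skip`, `--continue`, `--normalize-titles` and `--use-mpv-mod` only override the config file when they were actually typed on the command line. Here is a toy sketch of that click pattern outside FastAnime, with the config value faked in-line.

```python
# Toy sketch of click's parameter-source check: only override a config default
# when the user explicitly passed the flag on the command line.
import click


@click.command()
@click.option("--skip/--no-skip", default=False, help="Skip openings and endings")
@click.pass_context
def cli(ctx, skip):
    config_skip = True  # imagine this came from config.ini

    if ctx.get_parameter_source("skip") == click.core.ParameterSource.COMMANDLINE:
        config_skip = skip  # the user asked for this explicitly

    click.echo(f"skip={config_skip}")


if __name__ == "__main__":
    cli()
```

Without the source check, click's default value for the flag would silently clobber the user's config every run.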

View File

@@ -7,16 +7,23 @@ if TYPE_CHECKING:
@click.command(help="View anime you completed") @click.command(help="View anime you completed")
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def completed(config: "Config"): def completed(config: "Config", dump_json):
from sys import exit
from ....anilist import AniList from ....anilist import AniList
from ...interfaces import anilist_interfaces from ...utils.tools import FastAnimeRuntimeState
from ...utils.tools import FastAnimeRuntimeState, exit_app
if not config.user: if not config.user:
print("Not authenticated") print("Not authenticated")
print("Please run: fastanime anilist loggin") print("Please run: fastanime anilist loggin")
exit_app() exit(1)
anime_list = AniList.get_anime_list("COMPLETED") anime_list = AniList.get_anime_list("COMPLETED")
if not anime_list or not anime_list[1]: if not anime_list or not anime_list[1]:
return return
@@ -27,6 +34,13 @@ def completed(config: "Config"):
for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"] for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"]
] # pyright:ignore ] # pyright:ignore
anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_list[1] import json
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_list))
else:
from ...interfaces import anilist_interfaces
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_list[1]
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)
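The same `--dump-json`/`-d` flag is added to every AniList subcommand in this compare: when set, the command prints the raw response and skips the interactive menu, and the heavier interface imports happen only in the interactive branch. A stripped-down sketch of the shared pattern, with a stand-in for the real AniList call:

```python
# Stripped-down sketch of the --dump-json pattern repeated across the anilist
# subcommands; fetch_completed_list stands in for the real AniList request.
import click


def fetch_completed_list():
    return {"data": {"Page": {"media": [{"title": {"romaji": "Example"}}]}}}


@click.command()
@click.option("--dump-json", "-d", is_flag=True, help="Print JSON instead of opening the menu")
def completed(dump_json):
    anime_list = fetch_completed_list()
    if dump_json:
        import json

        click.echo(json.dumps(anime_list))
    else:
        # the real command imports the interactive interfaces only here,
        # keeping start-up cheap for the JSON path
        click.echo("would open the interactive AniList menu")


if __name__ == "__main__":
    completed()
```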

View File

@@ -7,26 +7,40 @@ if TYPE_CHECKING:
@click.command(help="View anime you dropped") @click.command(help="View anime you dropped")
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def dropped(config: "Config"): def dropped(config: "Config", dump_json):
from sys import exit
from ....anilist import AniList from ....anilist import AniList
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState, exit_app
if not config.user: if not config.user:
print("Not authenticated") print("Not authenticated")
print("Please run: fastanime anilist loggin") print("Please run: fastanime anilist loggin")
exit_app() exit(1)
anime_list = AniList.get_anime_list("DROPPED") anime_list = AniList.get_anime_list("DROPPED")
if not anime_list: if not anime_list:
return exit(1)
if not anime_list[0] or not anime_list[1]: if not anime_list[0] or not anime_list[1]:
return exit(1)
media = [ media = [
mediaListItem["media"] mediaListItem["media"]
for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"] for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"]
] # pyright:ignore ] # pyright:ignore
anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_list[1] import json
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_list[1]))
else:
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_list[1]
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)

View File

@@ -5,14 +5,30 @@ import click
help="Fetch the top 15 most favourited anime from anilist", help="Fetch the top 15 most favourited anime from anilist",
short_help="View most favourited anime", short_help="View most favourited anime",
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def favourites(config): def favourites(config, dump_json):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
anime_data = AniList.get_most_favourite() anime_data = AniList.get_most_favourite()
if anime_data[0]: if anime_data[0]:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_data[1] import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_data[1]))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_data[1]
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)

View File

@@ -11,11 +11,11 @@ if TYPE_CHECKING:
@click.option("--erase", "-e", help="Erase your login details", is_flag=True) @click.option("--erase", "-e", help="Erase your login details", is_flag=True)
@click.pass_obj @click.pass_obj
def login(config: "Config", status, erase): def login(config: "Config", status, erase):
from sys import exit
from rich import print from rich import print
from rich.prompt import Confirm, Prompt from rich.prompt import Confirm, Prompt
from ...utils.tools import exit_app
if status: if status:
is_logged_in = True if config.user else False is_logged_in = True if config.user else False
message = ( message = (
@@ -23,16 +23,16 @@ def login(config: "Config", status, erase):
) )
print(message) print(message)
print(config.user) print(config.user)
exit_app() exit(0)
elif erase: elif erase:
if Confirm.ask( if Confirm.ask(
"Are you sure you want to erase your login status", default=False "Are you sure you want to erase your login status", default=False
): ):
config.update_user({}) config.update_user({})
print("Success") print("Success")
exit_app(0) exit(0)
else: else:
exit_app(1) exit(1)
else: else:
from click import launch from click import launch
@@ -41,7 +41,7 @@ def login(config: "Config", status, erase):
if config.user: if config.user:
print("Already logged in :confused:") print("Already logged in :confused:")
if not Confirm.ask("or would you like to reloggin", default=True): if not Confirm.ask("or would you like to reloggin", default=True):
exit_app() exit(0)
# ---- new loggin ----- # ---- new loggin -----
print( print(
f"A browser session will be opened ( [link]{config.fastanime_anilist_app_login_url}[/link] )", f"A browser session will be opened ( [link]{config.fastanime_anilist_app_login_url}[/link] )",
@@ -52,10 +52,10 @@ def login(config: "Config", status, erase):
user = AniList.login_user(token) user = AniList.login_user(token)
if not user: if not user:
print("Sth went wrong", user) print("Sth went wrong", user)
exit_app() exit(1)
return return
user["token"] = token user["token"] = token
config.update_user(user) config.update_user(user)
print("Successfully saved credentials") print("Successfully saved credentials")
print(user) print(user)
exit_app() exit(0)

View File

@@ -13,6 +13,7 @@ def notifier(config: "Config"):
import logging import logging
import os import os
import time import time
from sys import exit
import requests import requests
from plyer import notification from plyer import notification
@@ -30,7 +31,7 @@ def notifier(config: "Config"):
if not config.user: if not config.user:
print("Not Authenticated") print("Not Authenticated")
print("Run the following to get started: fastanime anilist loggin") print("Run the following to get started: fastanime anilist loggin")
return exit(1)
run = True run = True
# WARNING: Mess around with this value at your own risk # WARNING: Mess around with this value at your own risk
timeout = 2 # time is in minutes timeout = 2 # time is in minutes

View File

@@ -7,26 +7,40 @@ if TYPE_CHECKING:
@click.command(help="View anime you paused on watching") @click.command(help="View anime you paused on watching")
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def paused(config: "Config"): def paused(config: "Config", dump_json):
from sys import exit
from ....anilist import AniList from ....anilist import AniList
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState, exit_app
if not config.user: if not config.user:
print("Not authenticated") print("Not authenticated")
print("Please run: fastanime anilist loggin") print("Please run: fastanime anilist loggin")
exit_app() exit(1)
anime_list = AniList.get_anime_list("PAUSED") anime_list = AniList.get_anime_list("PAUSED")
if not anime_list: if not anime_list:
return exit(1)
if not anime_list[0] or not anime_list[1]: if not anime_list[0] or not anime_list[1]:
return exit(1)
media = [ media = [
mediaListItem["media"] mediaListItem["media"]
for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"] for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"]
] # pyright:ignore ] # pyright:ignore
anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore
anilist_config = FastAnimeRuntimeState() if dump_json:
anilist_config.data = anime_list[1] import json
anilist_interfaces.anilist_results_menu(config, anilist_config)
print(json.dumps(anime_list[1]))
else:
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState
anilist_config = FastAnimeRuntimeState()
anilist_config.data = anime_list[1]
anilist_interfaces.anilist_results_menu(config, anilist_config)

View File

@@ -7,26 +7,40 @@ if TYPE_CHECKING:
@click.command(help="View anime you are planning on watching") @click.command(help="View anime you are planning on watching")
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def planning(config: "Config"): def planning(config: "Config", dump_json):
from sys import exit
from ....anilist import AniList from ....anilist import AniList
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState, exit_app
if not config.user: if not config.user:
print("Not authenticated") print("Not authenticated")
print("Please run: fastanime anilist loggin") print("Please run: fastanime anilist loggin")
exit_app() exit(1)
anime_list = AniList.get_anime_list("PLANNING") anime_list = AniList.get_anime_list("PLANNING")
if not anime_list: if not anime_list:
return exit(1)
if not anime_list[0] or not anime_list[1]: if not anime_list[0] or not anime_list[1]:
return exit(1)
media = [ media = [
mediaListItem["media"] mediaListItem["media"]
for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"] for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"]
] # pyright:ignore ] # pyright:ignore
anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_list[1] import json
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_list[1]))
else:
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_list[1]
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)

View File

@@ -4,14 +4,30 @@ import click
@click.command( @click.command(
help="Fetch the top 15 most popular anime", short_help="View most popular anime" help="Fetch the top 15 most popular anime", short_help="View most popular anime"
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def popular(config): def popular(config, dump_json):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
anime_data = AniList.get_most_popular() anime_data = AniList.get_most_popular()
if anime_data[0]: if anime_data[0]:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_data[1] import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_data[1]))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_data[1]
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)

View File

@@ -5,23 +5,35 @@ import click
help="Get random anime from anilist based on a range of anilist anime ids that are seected at random", help="Get random anime from anilist based on a range of anilist anime ids that are seected at random",
short_help="View random anime", short_help="View random anime",
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def random_anime(config): def random_anime(config, dump_json):
import random import random
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
random_anime = range(1, 15000) random_anime = range(1, 100000)
random_anime = random.sample(random_anime, k=50) random_anime = random.sample(random_anime, k=50)
anime_data = AniList.search(id_in=list(random_anime)) anime_data = AniList.search(id_in=list(random_anime))
if anime_data[0]: if anime_data[0]:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_data[1] import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_data[1]))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_data[1]
anilist_results_menu(config, fastanime_runtime_state)
else: else:
print(anime_data[1]) exit(1)
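The random subcommand now samples 50 ids from the range 1–100,000 (previously 1–15,000) and asks AniList for whichever of them exist via `id_in`. The sampling step itself is a one-liner; a minimal sketch:

```python
# Sketch of the id sampling behind `fastanime anilist random`: draw 50 distinct
# ids and let the AniList search (id_in=...) drop the ones that don't exist.
import random

candidate_ids = random.sample(range(1, 100000), k=50)
print(candidate_ids[:5])
```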

View File

@@ -5,14 +5,30 @@ import click
help="Fetch the 15 most recently updated anime from anilist that are currently releasing", help="Fetch the 15 most recently updated anime from anilist that are currently releasing",
short_help="View recently updated anime", short_help="View recently updated anime",
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def recent(config): def recent(config, dump_json):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
anime_data = AniList.get_most_recently_updated() anime_data = AniList.get_most_recently_updated()
if anime_data[0]: if anime_data[0]:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_data[1] import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_data[1]))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_data[1]
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)

View File

@@ -7,26 +7,40 @@ if TYPE_CHECKING:
@click.command(help="View anime you are rewatching") @click.command(help="View anime you are rewatching")
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def rewatching(config: "Config"): def rewatching(config: "Config", dump_json):
from sys import exit
from ....anilist import AniList from ....anilist import AniList
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState, exit_app
if not config.user: if not config.user:
print("Not authenticated") print("Not authenticated")
print("Please run: fastanime anilist loggin") print("Please run: fastanime anilist loggin")
exit_app() exit(1)
anime_list = AniList.get_anime_list("REPEATING") anime_list = AniList.get_anime_list("REPEATING")
if not anime_list: if not anime_list:
return exit(1)
if not anime_list[0] or not anime_list[1]: if not anime_list[0] or not anime_list[1]:
return exit(1)
media = [ media = [
mediaListItem["media"] mediaListItem["media"]
for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"] for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"]
] # pyright:ignore ] # pyright:ignore
anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_list[1] import json
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_list[1]))
else:
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_list[1]
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)

View File

@@ -4,14 +4,30 @@ import click
@click.command( @click.command(
help="Fetch the 15 most scored anime", short_help="View most scored anime" help="Fetch the 15 most scored anime", short_help="View most scored anime"
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def scores(config): def scores(config, dump_json):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
anime_data = AniList.get_most_scored() anime_data = AniList.get_most_scored()
if anime_data[0]: if anime_data[0]:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.data = anime_data[1] import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_data[1]))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.data = anime_data[1]
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)

View File

@@ -2,20 +2,555 @@ import click
from ...completion_functions import anime_titles_shell_complete from ...completion_functions import anime_titles_shell_complete
tags_available = {
"Cast": ["Polyamorous"],
"Cast / Main Cast": [
"Anti-Hero",
"Elderly Protagonist",
"Ensemble Cast",
"Estranged Family",
"Female Protagonist",
"Male Protagonist",
"Primarily Adult Cast",
"Primarily Animal Cast",
"Primarily Child Cast",
"Primarily Female Cast",
"Primarily Male Cast",
"Primarily Teen Cast",
],
"Cast / Traits": [
"Age Regression",
"Agender",
"Aliens",
"Amnesia",
"Angels",
"Anthropomorphism",
"Aromantic",
"Arranged Marriage",
"Artificial Intelligence",
"Asexual",
"Butler",
"Centaur",
"Chimera",
"Chuunibyou",
"Clone",
"Cosplay",
"Cowboys",
"Crossdressing",
"Cyborg",
"Delinquents",
"Demons",
"Detective",
"Dinosaurs",
"Disability",
"Dissociative Identities",
"Dragons",
"Dullahan",
"Elf",
"Fairy",
"Femboy",
"Ghost",
"Goblin",
"Gods",
"Gyaru",
"Hikikomori",
"Homeless",
"Idol",
"Kemonomimi",
"Kuudere",
"Maids",
"Mermaid",
"Monster Boy",
"Monster Girl",
"Nekomimi",
"Ninja",
"Nudity",
"Nun",
"Office Lady",
"Oiran",
"Ojou-sama",
"Orphan",
"Pirates",
"Robots",
"Samurai",
"Shrine Maiden",
"Skeleton",
"Succubus",
"Tanned Skin",
"Teacher",
"Tomboy",
"Transgender",
"Tsundere",
"Twins",
"Vampire",
"Veterinarian",
"Vikings",
"Villainess",
"VTuber",
"Werewolf",
"Witch",
"Yandere",
"Zombie",
],
"Demographic": ["Josei", "Kids", "Seinen", "Shoujo", "Shounen"],
"Setting": ["Matriarchy"],
"Setting / Scene": [
"Bar",
"Boarding School",
"Circus",
"Coastal",
"College",
"Desert",
"Dungeon",
"Foreign",
"Inn",
"Konbini",
"Natural Disaster",
"Office",
"Outdoor",
"Prison",
"Restaurant",
"Rural",
"School",
"School Club",
"Snowscape",
"Urban",
"Work",
],
"Setting / Time": [
"Achronological Order",
"Anachronism",
"Ancient China",
"Dystopian",
"Historical",
"Time Skip",
],
"Setting / Universe": [
"Afterlife",
"Alternate Universe",
"Augmented Reality",
"Omegaverse",
"Post-Apocalyptic",
"Space",
"Urban Fantasy",
"Virtual World",
],
"Technical": [
"4-koma",
"Achromatic",
"Advertisement",
"Anthology",
"CGI",
"Episodic",
"Flash",
"Full CGI",
"Full Color",
"No Dialogue",
"Non-fiction",
"POV",
"Puppetry",
"Rotoscoping",
"Stop Motion",
],
"Theme / Action": [
"Archery",
"Battle Royale",
"Espionage",
"Fugitive",
"Guns",
"Martial Arts",
"Spearplay",
"Swordplay",
],
"Theme / Arts": [
"Acting",
"Calligraphy",
"Classic Literature",
"Drawing",
"Fashion",
"Food",
"Makeup",
"Photography",
"Rakugo",
"Writing",
],
"Theme / Arts-Music": [
"Band",
"Classical Music",
"Dancing",
"Hip-hop Music",
"Jazz Music",
"Metal Music",
"Musical Theater",
"Rock Music",
],
"Theme / Comedy": ["Parody", "Satire", "Slapstick", "Surreal Comedy"],
"Theme / Drama": [
"Bullying",
"Class Struggle",
"Coming of Age",
"Conspiracy",
"Eco-Horror",
"Fake Relationship",
"Kingdom Management",
"Rehabilitation",
"Revenge",
"Suicide",
"Tragedy",
],
"Theme / Fantasy": [
"Alchemy",
"Body Swapping",
"Cultivation",
"Fairy Tale",
"Henshin",
"Isekai",
"Kaiju",
"Magic",
"Mythology",
"Necromancy",
"Shapeshifting",
"Steampunk",
"Super Power",
"Superhero",
"Wuxia",
"Youkai",
],
"Theme / Game": ["Board Game", "E-Sports", "Video Games"],
"Theme / Game-Card & Board Game": [
"Card Battle",
"Go",
"Karuta",
"Mahjong",
"Poker",
"Shogi",
],
"Theme / Game-Sport": [
"Acrobatics",
"Airsoft",
"American Football",
"Athletics",
"Badminton",
"Baseball",
"Basketball",
"Bowling",
"Boxing",
"Cheerleading",
"Cycling",
"Fencing",
"Fishing",
"Fitness",
"Football",
"Golf",
"Handball",
"Ice Skating",
"Judo",
"Lacrosse",
"Parkour",
"Rugby",
"Scuba Diving",
"Skateboarding",
"Sumo",
"Surfing",
"Swimming",
"Table Tennis",
"Tennis",
"Volleyball",
"Wrestling",
],
"Theme / Other": [
"Adoption",
"Animals",
"Astronomy",
"Autobiographical",
"Biographical",
"Body Horror",
"Cannibalism",
"Chibi",
"Cosmic Horror",
"Crime",
"Crossover",
"Death Game",
"Denpa",
"Drugs",
"Economics",
"Educational",
"Environmental",
"Ero Guro",
"Filmmaking",
"Found Family",
"Gambling",
"Gender Bending",
"Gore",
"Language Barrier",
"LGBTQ+ Themes",
"Lost Civilization",
"Marriage",
"Medicine",
"Memory Manipulation",
"Meta",
"Mountaineering",
"Noir",
"Otaku Culture",
"Pandemic",
"Philosophy",
"Politics",
"Proxy Battle",
"Psychosexual",
"Reincarnation",
"Religion",
"Royal Affairs",
"Slavery",
"Software Development",
"Survival",
"Terrorism",
"Torture",
"Travel",
"War",
],
"Theme / Other-Organisations": [
"Assassins",
"Criminal Organization",
"Cult",
"Firefighters",
"Gangs",
"Mafia",
"Military",
"Police",
"Triads",
"Yakuza",
],
"Theme / Other-Vehicle": [
"Aviation",
"Cars",
"Mopeds",
"Motorcycles",
"Ships",
"Tanks",
"Trains",
],
"Theme / Romance": [
"Age Gap",
"Bisexual",
"Boys' Love",
"Female Harem",
"Heterosexual",
"Love Triangle",
"Male Harem",
"Matchmaking",
"Mixed Gender Harem",
"Teens' Love",
"Unrequited Love",
"Yuri",
],
"Theme / Sci Fi": [
"Cyberpunk",
"Space Opera",
"Time Loop",
"Time Manipulation",
"Tokusatsu",
],
"Theme / Sci Fi-Mecha": ["Real Robot", "Super Robot"],
"Theme / Slice of Life": [
"Agriculture",
"Cute Boys Doing Cute Things",
"Cute Girls Doing Cute Things",
"Family Life",
"Horticulture",
"Iyashikei",
"Parenthood",
],
}
tags_available_list = []
for tag_category, tags_in_category in tags_available.items():
tags_available_list.extend(tags_in_category)
@click.command( @click.command(
help="Search for anime using anilists api and get top ~50 results", help="Search for anime using anilists api and get top ~50 results",
short_help="Search for anime", short_help="Search for anime",
) )
@click.argument("title", shell_complete=anime_titles_shell_complete) @click.option("--title", "-t", shell_complete=anime_titles_shell_complete)
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.option(
"--season",
help="The season the media was released",
type=click.Choice(["WINTER", "SPRING", "SUMMER", "FALL"]),
)
@click.option(
"--status",
"-S",
help="The media status of the anime",
type=click.Choice(
["FINISHED", "RELEASING", "NOT_YET_RELEASED", "CANCELLED", "HIATUS"]
),
)
@click.option(
"--sort",
"-s",
help="What to sort the search results on",
type=click.Choice(
[
"ID",
"ID_DESC",
"TITLE_ROMAJI",
"TITLE_ROMAJI_DESC",
"TITLE_ENGLISH",
"TITLE_ENGLISH_DESC",
"TITLE_NATIVE",
"TITLE_NATIVE_DESC",
"TYPE",
"TYPE_DESC",
"FORMAT",
"FORMAT_DESC",
"START_DATE",
"START_DATE_DESC",
"END_DATE",
"END_DATE_DESC",
"SCORE",
"SCORE_DESC",
"POPULARITY",
"POPULARITY_DESC",
"TRENDING",
"TRENDING_DESC",
"EPISODES",
"EPISODES_DESC",
"DURATION",
"DURATION_DESC",
"STATUS",
"STATUS_DESC",
"CHAPTERS",
"CHAPTERS_DESC",
"VOLUMES",
"VOLUMES_DESC",
"UPDATED_AT",
"UPDATED_AT_DESC",
"SEARCH_MATCH",
"FAVOURITES",
"FAVOURITES_DESC",
]
),
)
@click.option(
"--genres",
"-g",
multiple=True,
help="the genres to filter by",
type=click.Choice(
[
"Action",
"Adventure",
"Comedy",
"Drama",
"Ecchi",
"Fantasy",
"Horror",
"Mahou Shoujo",
"Mecha",
"Music",
"Mystery",
"Psychological",
"Romance",
"Sci-Fi",
"Slice of Life",
"Sports",
"Supernatural",
"Thriller",
"Hentai",
]
),
)
@click.option(
"--tags",
"-T",
multiple=True,
help="the tags to filter by",
type=click.Choice(tags_available_list),
)
@click.option(
"--media-format",
"-f",
multiple=True,
help="Media format",
type=click.Choice(
["TV", "TV_SHORT", "MOVIE", "SPECIAL", "OVA", "MUSIC", "NOVEL", "ONE_SHOT"]
),
)
@click.option(
"--year",
"-y",
type=click.Choice(
[
"2024",
"2023",
"2022",
"2021",
"2020",
"2019",
"2018",
"2017",
"2016",
"2015",
"2014",
"2013",
"2012",
"2011",
"2010",
"2009",
"2008",
"2007",
"2006",
"2005",
"2004",
"2000",
"1990",
"1980",
"1970",
"1960",
"1950",
"1940",
"1930",
"1920",
"1910",
"1900",
]
),
help="the year the media was released",
)
@click.pass_obj @click.pass_obj
def search(config, title): def search(
config, title, dump_json, season, status, sort, genres, tags, media_format, year
):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
success, search_results = AniList.search(title) success, search_results = AniList.search(
query=title,
sort=sort,
status=status,
genre_in=list(genres),
season=season,
tag_in=list(tags),
seasonYear=year,
format_in=list(media_format),
)
if success: if success:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = search_results import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(search_results))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = search_results
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)
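The expanded search command above forwards season, status, sort, genres, tags, year and format to `AniList.search`. For orientation, here is a hedged sketch of the kind of GraphQL request that implies, sent directly to AniList's public endpoint with `requests`; the exact query the project builds inside its AniList wrapper may differ.

```python
# Hedged sketch of an AniList GraphQL search using the same filters the CLI
# exposes; this is a plain requests call, not FastAnime's AniList wrapper.
import requests

query = """
query ($seasonYear: Int, $sort: [MediaSort], $genre_in: [String], $tag_in: [String]) {
  Page(perPage: 50) {
    media(type: ANIME, seasonYear: $seasonYear, sort: $sort,
          genre_in: $genre_in, tag_in: $tag_in) {
      id
      title { romaji english }
    }
  }
}
"""
variables = {
    "seasonYear": 2024,
    "sort": ["POPULARITY_DESC"],
    "genre_in": ["Action"],
    "tag_in": ["Isekai"],
}
response = requests.post(
    "https://graphql.anilist.co",
    json={"query": query, "variables": variables},
    timeout=10,
)
for media in response.json()["data"]["Page"]["media"]:
    print(media["title"]["romaji"])
```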

View File

@@ -5,14 +5,30 @@ import click
help="Fetch the top 15 anime that are currently trending", help="Fetch the top 15 anime that are currently trending",
short_help="Trending anime 🔥🔥🔥", short_help="Trending anime 🔥🔥🔥",
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def trending(config): def trending(config, dump_json):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
success, data = AniList.get_trending() success, data = AniList.get_trending()
if success: if success:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = data import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(data))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = data
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)

View File

@@ -4,14 +4,30 @@ import click
@click.command( @click.command(
help="Fetch the 15 most anticipited anime", short_help="View upcoming anime" help="Fetch the 15 most anticipited anime", short_help="View upcoming anime"
) )
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def upcoming(config): def upcoming(config, dump_json):
from ....anilist import AniList from ....anilist import AniList
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
success, data = AniList.get_upcoming_anime() success, data = AniList.get_upcoming_anime()
if success: if success:
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = data import json
anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(data))
else:
from ...interfaces.anilist_interfaces import anilist_results_menu
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = data
anilist_results_menu(config, fastanime_runtime_state)
else:
from sys import exit
exit(1)

View File

@@ -7,26 +7,40 @@ if TYPE_CHECKING:
@click.command(help="View anime you are watching") @click.command(help="View anime you are watching")
@click.option(
"--dump-json",
"-d",
is_flag=True,
help="Only print out the results dont open anilist menu",
)
@click.pass_obj @click.pass_obj
def watching(config: "Config"): def watching(config: "Config", dump_json):
from sys import exit
from ....anilist import AniList from ....anilist import AniList
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState, exit_app
if not config.user: if not config.user:
print("Not authenticated") print("Not authenticated")
print("Please run: fastanime anilist loggin") print("Please run: fastanime anilist loggin")
exit_app() exit(1)
anime_list = AniList.get_anime_list("CURRENT") anime_list = AniList.get_anime_list("CURRENT")
if not anime_list: if not anime_list:
return exit(1)
if not anime_list[0] or not anime_list[1]: if not anime_list[0] or not anime_list[1]:
return exit(1)
media = [ media = [
mediaListItem["media"] mediaListItem["media"]
for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"] for mediaListItem in anime_list[1]["data"]["Page"]["mediaList"]
] # pyright:ignore ] # pyright:ignore
anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore anime_list[1]["data"]["Page"]["media"] = media # pyright:ignore
fastanime_runtime_state = FastAnimeRuntimeState() if dump_json:
fastanime_runtime_state.anilist_data = anime_list[1] import json
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)
print(json.dumps(anime_list[1]))
else:
from ...interfaces import anilist_interfaces
from ...utils.tools import FastAnimeRuntimeState
fastanime_runtime_state = FastAnimeRuntimeState()
fastanime_runtime_state.anilist_data = anime_list[1]
anilist_interfaces.anilist_results_menu(config, fastanime_runtime_state)

View File

@@ -1,4 +1,3 @@
-import time
 from typing import TYPE_CHECKING
 import click
@@ -28,8 +27,14 @@ if TYPE_CHECKING:
     help="A range of episodes to download (start-end)",
 )
 @click.option(
-    "--force-unknown-ext",
+    "--file",
     "-f",
+    type=click.File(),
+    help="A file to read from all anime to download",
+)
+@click.option(
+    "--force-unknown-ext",
+    "-F",
     help="This option forces yt-dlp to download extensions its not aware of",
     is_flag=True,
 )
@@ -41,15 +46,43 @@ if TYPE_CHECKING:
default=True, default=True,
) )
@click.option("--verbose", "-v", is_flag=True, help="Download verbosely (everywhere)") @click.option("--verbose", "-v", is_flag=True, help="Download verbosely (everywhere)")
@click.option(
"--merge", "-m", is_flag=True, help="Merge the subfile with video using ffmpeg"
)
@click.option(
"--clean",
"-c",
is_flag=True,
help="After merging delete the original files",
)
@click.option(
"--wait-time",
"-w",
type=int,
help="The amount of time to wait after downloading is complete before the screen is completely cleared",
default=60,
)
@click.option(
"--prompt/--no-prompt",
help="Whether to prompt for anything instead just do the best thing",
default=True,
)
@click.pass_obj @click.pass_obj
def download( def download(
config: "Config", config: "Config",
anime_titles: list, anime_titles: tuple,
episode_range, episode_range,
file,
force_unknown_ext, force_unknown_ext,
silent, silent,
verbose, verbose,
merge,
clean,
wait_time,
prompt,
): ):
import time
from rich import print from rich import print
from rich.progress import Progress from rich.progress import Progress
from thefuzz import fuzz from thefuzz import fuzz
@@ -59,15 +92,29 @@ def download(
from ...libs.fzf import fzf from ...libs.fzf import fzf
from ...Utility.downloader.downloader import downloader from ...Utility.downloader.downloader import downloader
from ..utils.tools import exit_app from ..utils.tools import exit_app
from ..utils.utils import filter_by_quality, fuzzy_inquirer from ..utils.utils import (
filter_by_quality,
fuzzy_inquirer,
move_preferred_subtitle_lang_to_top,
)
anime_provider = AnimeProvider(config.provider) anime_provider = AnimeProvider(config.provider)
anilist_anime_info = None
translation_type = config.translation_type translation_type = config.translation_type
download_dir = config.downloads_dir download_dir = config.downloads_dir
if file:
contents = file.read()
anime_titles_from_file = tuple(
[title for title in contents.split("\n") if title]
)
file.close()
anime_titles = (*anime_titles_from_file, *anime_titles)
print(f"[green bold]Queued:[/] {anime_titles}") print(f"[green bold]Queued:[/] {anime_titles}")
for anime_title in anime_titles: for anime_title in anime_titles:
if anime_title == "EOF":
break
print(f"[green bold]Now Downloading: [/] {anime_title}") print(f"[green bold]Now Downloading: [/] {anime_title}")
# ---- search for anime ---- # ---- search for anime ----
with Progress() as progress: with Progress() as progress:
@@ -79,25 +126,40 @@ def download(
print("Search results failed") print("Search results failed")
input("Enter to retry") input("Enter to retry")
download( download(
config, anime_title, episode_range, force_unknown_ext, silent, verbose config,
anime_title,
episode_range,
file,
force_unknown_ext,
silent,
verbose,
merge,
clean,
wait_time,
prompt,
) )
return return
search_results = search_results["results"] search_results = search_results["results"]
if not search_results:
print("Nothing muches your search term")
continue
search_results_ = { search_results_ = {
search_result["title"]: search_result for search_result in search_results search_result["title"]: search_result for search_result in search_results
} }
if config.auto_select: if config.auto_select:
search_result = max( selected_anime_title = max(
search_results_.keys(), key=lambda title: fuzz.ratio(title, anime_title) search_results_.keys(), key=lambda title: fuzz.ratio(title, anime_title)
) )
print("[cyan]Auto selecting:[/] ", search_result) print("[cyan]Auto selecting:[/] ", selected_anime_title)
else: else:
choices = list(search_results_.keys()) choices = list(search_results_.keys())
if config.use_fzf: if config.use_fzf:
search_result = fzf.run(choices, "Please Select title: ", "FastAnime") selected_anime_title = fzf.run(
choices, "Please Select title: ", "FastAnime"
)
else: else:
search_result = fuzzy_inquirer( selected_anime_title = fuzzy_inquirer(
choices, choices,
"Please Select title", "Please Select title",
) )
@@ -106,13 +168,23 @@ def download(
with Progress() as progress: with Progress() as progress:
progress.add_task("Fetching Anime...", total=None) progress.add_task("Fetching Anime...", total=None)
anime: Anime | None = anime_provider.get_anime( anime: Anime | None = anime_provider.get_anime(
search_results_[search_result]["id"] search_results_[selected_anime_title]["id"]
) )
if not anime: if not anime:
print("Sth went wring anime no found") print("Sth went wring anime no found")
input("Enter to continue...") input("Enter to continue...")
download( download(
config, anime_title, episode_range, force_unknown_ext, silent, verbose config,
anime_title,
episode_range,
file,
force_unknown_ext,
silent,
verbose,
merge,
clean,
wait_time,
prompt,
) )
return return
@@ -146,6 +218,11 @@ def download(
else: else:
episodes_range = sorted(episodes, key=float) episodes_range = sorted(episodes, key=float)
if config.normalize_titles:
from ...libs.common.mini_anilist import get_basic_anime_info_by_title
anilist_anime_info = get_basic_anime_info_by_title(anime["title"])
# lets download em # lets download em
for episode in episodes_range: for episode in episodes_range:
try: try:
@@ -165,37 +242,12 @@
                 if config.server == "top":
                     with Progress() as progress:
                         progress.add_task("Fetching top server...", total=None)
-                        server = next(streams, None)
-                    if not server:
+                        server_name = next(streams, None)
+                    if not server_name:
                         print("Sth went wrong when fetching the server")
                         continue
-                    stream_link = filter_by_quality(config.quality, server["links"])
-                    if not stream_link:
-                        print("[yellow bold]WARNING:[/] No streams found")
-                        time.sleep(1)
-                        print("Continuing...")
-                        continue
-                    link = stream_link["link"]
-                    provider_headers = server["headers"]
-                    episode_title = server["episode_title"]
-                else:
-                    with Progress() as progress:
-                        progress.add_task("Fetching servers", total=None)
-                        # prompt for server selection
-                        servers = {server["server"]: server for server in streams}
-                    servers_names = list(servers.keys())
-                    if config.server in servers_names:
-                        server = config.server
-                    else:
-                        if config.use_fzf:
-                            server = fzf.run(servers_names, "Select an link: ")
-                        else:
-                            server = fuzzy_inquirer(
-                                servers_names,
-                                "Select link",
-                            )
                     stream_link = filter_by_quality(
-                        config.quality, servers[server]["links"]
+                        config.quality, server_name["links"]
                     )
                     if not stream_link:
                         print("[yellow bold]WARNING:[/] No streams found")
@@ -203,14 +255,58 @@
                         print("Continuing...")
                         continue
                     link = stream_link["link"]
-                    provider_headers = servers[server]["headers"]
-                    episode_title = servers[server]["episode_title"]
+                    provider_headers = server_name["headers"]
+                    episode_title = server_name["episode_title"]
+                    subtitles = server_name["subtitles"]
+                else:
+                    with Progress() as progress:
+                        progress.add_task("Fetching servers", total=None)
+                        # prompt for server selection
+                        servers = {server["server"]: server for server in streams}
+                    servers_names = list(servers.keys())
+                    if config.server in servers_names:
+                        server_name = config.server
+                    else:
+                        if config.use_fzf:
+                            server_name = fzf.run(servers_names, "Select an link: ")
+                        else:
+                            server_name = fuzzy_inquirer(
+                                servers_names,
+                                "Select link",
+                            )
+                    stream_link = filter_by_quality(
+                        config.quality, servers[server_name]["links"]
+                    )
+                    if not stream_link:
+                        print("[yellow bold]WARNING:[/] No streams found")
+                        time.sleep(1)
+                        print("Continuing...")
+                        continue
+                    link = stream_link["link"]
+                    provider_headers = servers[server_name]["headers"]
+                    subtitles = servers[server_name]["subtitles"]
+                    episode_title = servers[server_name]["episode_title"]
+                if anilist_anime_info:
+                    selected_anime_title = (
+                        anilist_anime_info["title"][config.preferred_language]
+                        or anilist_anime_info["title"]["romaji"]
+                        or anilist_anime_info["title"]["english"]
+                    )
+                    import re
+                    for episode_detail in anilist_anime_info["episodes"]:
+                        if re.match(f"Episode {episode}", episode_detail["title"]):
+                            episode_title = episode_detail["title"]
+                            break
-                print(f"[purple]Now Downloading:[/] {search_result} Episode {episode}")
+                print(f"[purple]Now Downloading:[/] {episode_title}")
+                subtitles = move_preferred_subtitle_lang_to_top(
+                    subtitles, config.sub_lang
+                )
                 downloader._download_file(
                     link,
-                    anime["title"],
+                    selected_anime_title,
                     episode_title,
                     download_dir,
                     silent,
@@ -218,10 +314,15 @@
                     force_unknown_ext,
                     verbose,
                     headers=provider_headers,
+                    sub=subtitles[0]["url"] if subtitles else "",
+                    merge=merge,
+                    clean=clean,
+                    prompt=prompt,
                 )
         except Exception as e:
             print(e)
             time.sleep(1)
             print("Continuing...")
     print("Done Downloading")
+    time.sleep(wait_time)
     exit_app()
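A compact sketch of how the new --file option and the EOF sentinel behave: titles are read one per line, merged ahead of the positional titles, and the queue stops at EOF. The download work itself is reduced to a print; only the option wiring mirrors the diff.

# Sketch of merging --file titles with positional titles (standalone).
import click


@click.command()
@click.argument("anime_titles", nargs=-1)
@click.option("--file", "-f", type=click.File(), help="File with one title per line")
def download(anime_titles: tuple, file):
    if file:
        titles_from_file = tuple(t for t in file.read().split("\n") if t)
        file.close()
        anime_titles = (*titles_from_file, *anime_titles)
    for title in anime_titles:
        if title == "EOF":  # sentinel: stop processing the rest of the queue
            break
        click.echo(f"Now Downloading: {title}")


if __name__ == "__main__":
    download()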

View File

@@ -76,6 +76,9 @@ def grab(
             continue
         search_results = search_results["results"]
+        if not search_results:
+            logger.error("no results for your search")
+            exit(1)
         search_results_ = {
             search_result["title"]: search_result for search_result in search_results
         }
@@ -88,13 +91,13 @@ def grab(
         anime = anime_provider.get_anime(search_results_[search_result]["id"])
         if not anime:
             exit(1)
-        episodes = sorted(
-            anime["availableEpisodesDetail"][config.translation_type], key=float
-        )
         if anime_info_only:
             # grab only the anime data skipping all lines after this
             grabbed_animes.append(anime)
             continue
+        episodes = sorted(
+            anime["availableEpisodesDetail"][config.translation_type], key=float
+        )
         # where the magic happens
         if episode_range:

View File

@@ -35,9 +35,14 @@ def search(config: Config, anime_titles: str, episode_range: str):
     from ...libs.rofi import Rofi
     from ..utils.mpv import run_mpv
     from ..utils.tools import exit_app
-    from ..utils.utils import filter_by_quality, fuzzy_inquirer
+    from ..utils.utils import (
+        filter_by_quality,
+        fuzzy_inquirer,
+        move_preferred_subtitle_lang_to_top,
+    )
     anime_provider = AnimeProvider(config.provider)
+    anilist_anime_info = None
     print(f"[green bold]Streaming:[/] {anime_titles}")
     for anime_title in anime_titles:
@@ -119,6 +124,11 @@ def search(config: Config, anime_titles: str, episode_range: str):
episodes_range = iter(episodes_range) episodes_range = iter(episodes_range)
if config.normalize_titles:
from ...libs.common.mini_anilist import get_basic_anime_info_by_title
anilist_anime_info = get_basic_anime_info_by_title(anime["title"])
def stream_anime(): def stream_anime():
clear() clear()
episode = None episode = None
@@ -177,6 +187,7 @@ def search(config: Config, anime_titles: str, episode_range: str):
stream_anime() stream_anime()
return return
link = stream_link["link"] link = stream_link["link"]
subtitles = server["subtitles"]
stream_headers = server["headers"] stream_headers = server["headers"]
episode_title = server["episode_title"] episode_title = server["episode_title"]
else: else:
@@ -207,15 +218,38 @@ def search(config: Config, anime_titles: str, episode_range: str):
return return
link = stream_link["link"] link = stream_link["link"]
stream_headers = servers[server]["headers"] stream_headers = servers[server]["headers"]
subtitles = servers[server]["subtitles"]
episode_title = servers[server]["episode_title"] episode_title = servers[server]["episode_title"]
print(f"[purple]Now Playing:[/] {search_result} Episode {episode}")
selected_anime_title = search_result
if anilist_anime_info:
selected_anime_title = (
anilist_anime_info["title"][config.preferred_language]
or anilist_anime_info["title"]["romaji"]
or anilist_anime_info["title"]["english"]
)
import re
for episode_detail in anilist_anime_info["episodes"]:
if re.match(f"Episode {episode}", episode_detail["title"]):
episode_title = episode_detail["title"]
break
print(
f"[purple]Now Playing:[/] {selected_anime_title} Episode {episode}"
)
subtitles = move_preferred_subtitle_lang_to_top(
subtitles, config.sub_lang
)
if config.sync_play: if config.sync_play:
from ..utils.syncplay import SyncPlayer from ..utils.syncplay import SyncPlayer
SyncPlayer(link, episode_title, headers=stream_headers) SyncPlayer(
link, episode_title, headers=stream_headers, subtitles=subtitles
)
else: else:
run_mpv(link, episode_title, headers=stream_headers) run_mpv(
link, episode_title, headers=stream_headers, subtitles=subtitles
)
except IndexError as e: except IndexError as e:
print(e) print(e)
input("Enter to continue") input("Enter to continue")

View File

@@ -10,8 +10,6 @@ query($query:String){
 Page(perPage:50){
     pageInfo{
         total
-        currentPage
-        hasNextPage
     }
     media(search:$query,type:ANIME){
         id
@@ -46,20 +44,6 @@ def get_anime_titles(query: str, variables: dict = {}):
     )
     anilist_data = response.json()
-    # ensuring you dont get blocked
-    if (
-        int(response.headers.get("X-RateLimit-Remaining", 0)) < 30
-        and not response.status_code == 500
-    ):
-        print("Warning you are exceeding the allowed number of calls per minute")
-        logger.warning(
-            "You are exceeding the allowed number of calls per minute for the AniList api enforcing timeout"
-        )
-        print("Forced timeout will now be initiated")
-        import time
-        print("sleeping...")
-        time.sleep(1 * 60)
     if response.status_code == 200:
         eng_titles = [
             anime["title"]["english"]
@@ -80,4 +64,16 @@
 def anime_titles_shell_complete(ctx, param, incomplete):
-    return [name for name in get_anime_titles(anime_title_query, {"query": incomplete})]
+    incomplete = incomplete.strip()
+    if not incomplete:
+        incomplete = None
+        variables = {}
+    else:
+        variables = {"query": incomplete}
+    return get_anime_titles(anime_title_query, variables)
+if __name__ == "__main__":
+    t = input("Enter title")
+    results = get_anime_titles(anime_title_query, {"query": t})
+    print(results)
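A small sketch of how a click shell_complete callback like anime_titles_shell_complete gets attached to an argument; the static title list stands in for the AniList lookup above.

# Sketch of click shell completion on a variadic argument (click 8.x).
import click


def anime_titles_shell_complete(ctx, param, incomplete):
    titles = ["One Piece", "One Punch Man", "Naruto"]  # stand-in for get_anime_titles()
    incomplete = incomplete.strip()
    return [t for t in titles if t.lower().startswith(incomplete.lower())]


@click.command()
@click.argument("anime_titles", nargs=-1, shell_complete=anime_titles_shell_complete)
def search(anime_titles):
    click.echo(", ".join(anime_titles))


if __name__ == "__main__":
    # for an installed "fastanime" entry point, completions become active after e.g.
    #   eval "$(_FASTANIME_COMPLETE=bash_source fastanime)"
    search()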

View File

@@ -96,6 +96,8 @@ class Config(object):
                 "rofi_theme_input": "",
                 "rofi_theme_confirm": "",
                 "ffmpegthumnailer_seek_time": "-1",
+                "sub_lang": "eng",
+                "normalize_titles": "true",
             }
         )
         self.configparser.add_section("stream")
@@ -109,6 +111,7 @@
         # --- set config values from file or using defaults ---
         self.downloads_dir = self.get_downloads_dir()
+        self.sub_lang = self.get_sub_lang()
         self.provider = self.get_provider()
         self.use_fzf = self.get_use_fzf()
         self.use_rofi = self.get_use_rofi()
@@ -119,6 +122,7 @@
         self.sort_by = self.get_sort_by()
         self.continue_from_history = self.get_continue_from_history()
         self.auto_next = self.get_auto_next()
+        self.normalize_titles = self.get_normalize_titles()
         self.auto_select = self.get_auto_select()
         self.use_mpv_mod = self.get_use_mpv_mod()
         self.quality = self.get_quality()
@@ -187,6 +191,9 @@
     def get_preferred_language(self):
         return self.configparser.get("general", "preferred_language")
+    def get_sub_lang(self):
+        return self.configparser.get("general", "sub_lang")
     def get_downloads_dir(self):
         return self.configparser.get("general", "downloads_dir")
@@ -212,6 +219,9 @@
     def get_rofi_theme_confirm(self):
         return self.configparser.get("general", "rofi_theme_confirm")
+    def get_normalize_titles(self):
+        return self.configparser.getboolean("general", "normalize_titles")
     # --- stream section ---
     def get_skip(self):
         return self.configparser.getboolean("stream", "skip")
@@ -310,6 +320,9 @@ format = {self.format}
 [general]
+# whether to normalize provider titles
+normalize_titles = {self.normalize_titles}
 # can be [allanime,animepahe]
 provider = {self.provider}
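A standalone sketch of the configparser behaviour behind the new sub_lang and normalize_titles options: defaults are layered in first, then read back with get()/getboolean(), exactly as the Config class does.

# Sketch of the configparser defaults pattern used by Config.
from configparser import ConfigParser

configparser = ConfigParser()
configparser.read_dict({"general": {"sub_lang": "eng", "normalize_titles": "true"}})
# a user config file would be layered on top with configparser.read(path)

sub_lang = configparser.get("general", "sub_lang")
normalize_titles = configparser.getboolean("general", "normalize_titles")
print(sub_lang, normalize_titles)  # -> eng True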

View File

@@ -21,7 +21,11 @@ from ...Utility.data import anime_normalizer
 from ...Utility.utils import anime_title_percentage_match
 from ..utils.mpv import run_mpv
 from ..utils.tools import exit_app
-from ..utils.utils import filter_by_quality, fuzzy_inquirer
+from ..utils.utils import (
+    filter_by_quality,
+    fuzzy_inquirer,
+    move_preferred_subtitle_lang_to_top,
+)
 from .utils import aniskip
 if TYPE_CHECKING:
@@ -113,45 +117,53 @@ def media_player_controls(
current_episode_number, current_episode_number,
): ):
custom_args.extend(args) custom_args.extend(args)
subtitles = move_preferred_subtitle_lang_to_top(
selected_server["subtitles"], config.sub_lang
)
episode_title = selected_server["episode_title"]
if config.normalize_titles:
import re
for episode_detail in fastanime_runtime_state.selected_anime_anilist[
"streamingEpisodes"
]:
if re.match(
f"Episode {current_episode_number}", episode_detail["title"]
):
episode_title = episode_detail["title"]
break
if config.sync_play: if config.sync_play:
from ..utils.syncplay import SyncPlayer from ..utils.syncplay import SyncPlayer
stop_time, total_time = SyncPlayer( stop_time, total_time = SyncPlayer(
current_episode_stream_link, current_episode_stream_link,
selected_server["episode_title"], episode_title,
headers=selected_server["headers"], headers=selected_server["headers"],
subtitles=subtitles,
) )
elif config.use_mpv_mod: elif config.use_mpv_mod:
from ..utils.player import player from ..utils.player import player
mpv = player.create_player( player.create_player(
current_episode_stream_link, current_episode_stream_link,
config.anime_provider, config.anime_provider,
fastanime_runtime_state, fastanime_runtime_state,
config, config,
selected_server["episode_title"], episode_title,
start_time,
headers=selected_server["headers"], headers=selected_server["headers"],
subtitles=subtitles,
) )
# TODO: implement custom aniskip
if custom_args and None:
chapters_file = custom_args[0].split("=", 1)
script_opts = custom_args[1].split("=", 1)
mpv._set_property("chapters-file", chapters_file[1])
mpv._set_property("script-opts", script_opts[1])
if not start_time == "0":
mpv.start = start_time
mpv.wait_for_shutdown()
mpv.terminate()
stop_time = player.last_stop_time stop_time = player.last_stop_time
total_time = player.last_total_time total_time = player.last_total_time
else: else:
stop_time, total_time = run_mpv( stop_time, total_time = run_mpv(
current_episode_stream_link, current_episode_stream_link,
selected_server["episode_title"], episode_title,
start_time=start_time, start_time=start_time,
custom_args=custom_args, custom_args=custom_args,
headers=selected_server["headers"], headers=selected_server["headers"],
subtitles=subtitles,
) )
# either update the watch history to the next episode or current depending on progress # either update the watch history to the next episode or current depending on progress
@@ -502,6 +514,8 @@ def provider_anime_episode_servers_menu(
     )
     if start_time != "0" and episode_in_history == current_episode_number:
         print("[green]Continuing from:[/] ", start_time)
+    else:
+        start_time = "0"
     custom_args = []
     if config.skip:
         if args := aniskip(
@@ -509,36 +523,45 @@ def provider_anime_episode_servers_menu(
current_episode_number, current_episode_number,
): ):
custom_args.extend(args) custom_args.extend(args)
subtitles = move_preferred_subtitle_lang_to_top(
selected_server["subtitles"], config.sub_lang
)
episode_title = selected_server["episode_title"]
if config.normalize_titles:
import re
for episode_detail in fastanime_runtime_state.selected_anime_anilist[
"streamingEpisodes"
]:
if re.match(f"Episode {current_episode_number}", episode_detail["title"]):
episode_title = episode_detail["title"]
break
if config.sync_play: if config.sync_play:
from ..utils.syncplay import SyncPlayer from ..utils.syncplay import SyncPlayer
stop_time, total_time = SyncPlayer( stop_time, total_time = SyncPlayer(
current_stream_link, current_stream_link,
selected_server["episode_title"], episode_title,
headers=selected_server["headers"], headers=selected_server["headers"],
subtitles=subtitles,
) )
elif config.use_mpv_mod: elif config.use_mpv_mod:
from ..utils.player import player from ..utils.player import player
mpv = player.create_player( if start_time == "0" and episode_in_history != current_episode_number:
start_time = "0"
player.create_player(
current_stream_link, current_stream_link,
anime_provider, anime_provider,
fastanime_runtime_state, fastanime_runtime_state,
config, config,
selected_server["episode_title"], episode_title,
start_time,
headers=selected_server["headers"], headers=selected_server["headers"],
subtitles=subtitles,
) )
# TODO: implement custom aniskip intergration
if custom_args and None:
chapters_file = custom_args[0].split("=", 1)
script_opts = custom_args[1].split("=", 1)
mpv._set_property("chapters-file", chapters_file[1])
mpv._set_property("script-opts", script_opts[1])
if not start_time == "0" and episode_in_history == current_episode_number:
mpv.start = start_time
mpv.wait_for_shutdown()
mpv.terminate()
stop_time = player.last_stop_time stop_time = player.last_stop_time
total_time = player.last_total_time total_time = player.last_total_time
current_episode_number = fastanime_runtime_state.provider_current_episode_number current_episode_number = fastanime_runtime_state.provider_current_episode_number
@@ -547,10 +570,11 @@ def provider_anime_episode_servers_menu(
start_time = "0" start_time = "0"
stop_time, total_time = run_mpv( stop_time, total_time = run_mpv(
current_stream_link, current_stream_link,
selected_server["episode_title"], episode_title,
start_time=start_time, start_time=start_time,
custom_args=custom_args, custom_args=custom_args,
headers=selected_server["headers"], headers=selected_server["headers"],
subtitles=subtitles,
) )
print("Finished at: ", stop_time) print("Finished at: ", stop_time)
@@ -664,11 +688,21 @@ def provider_anime_episodes_menu(
     # prompt for episode number if not set
     if not current_episode_number or current_episode_number not in total_episodes:
         choices = [*total_episodes, "Back"]
+        preview = None
+        if config.preview:
+            from .utils import get_fzf_episode_preview
+            e = fastanime_runtime_state.selected_anime_anilist["episodes"]
+            if e:
+                eps = range(1, e)
+            else:
+                eps = total_episodes
+            preview = get_fzf_episode_preview(
+                fastanime_runtime_state.selected_anime_anilist, eps
+            )
         if config.use_fzf:
             current_episode_number = fzf.run(
-                choices,
-                prompt="Select Episode:",
-                header=anime_title,
+                choices, prompt="Select Episode:", header=anime_title, preview=preview
             )
         elif config.use_rofi:
             current_episode_number = Rofi.run(choices, "Select Episode")
@@ -681,14 +715,14 @@ def provider_anime_episodes_menu(
     if current_episode_number == "Back":
         media_actions_menu(config, fastanime_runtime_state)
         return
-    # try to get the start time and if not found default to "0"
-    start_time = user_watch_history.get(str(anime_id_anilist), {}).get(
-        "start_time", "0"
-    )
-    config.update_watch_history(
-        anime_id_anilist, current_episode_number, start_time=start_time
-    )
+    #
+    # # try to get the start time and if not found default to "0"
+    # start_time = user_watch_history.get(str(anime_id_anilist), {}).get(
+    #     "start_time", "0"
+    # )
+    # config.update_watch_history(
+    #     anime_id_anilist, current_episode_number, start_time=start_time
+    # )
     # update runtime data
     fastanime_runtime_state.provider_available_episodes = total_episodes
@@ -936,7 +970,7 @@ def media_actions_menu(
score = Rofi.ask("Enter Score", is_int=True) score = Rofi.ask("Enter Score", is_int=True)
score = max(100, min(0, score)) score = max(100, min(0, score))
else: else:
score = inquirer.number( score = inquirer.number( # pyright:ignore
message="Enter the score:", message="Enter the score:",
min_allowed=0, min_allowed=0,
max_allowed=100, max_allowed=100,
@@ -1009,6 +1043,42 @@ def media_actions_menu(
media_actions_menu(config, fastanime_runtime_state) media_actions_menu(config, fastanime_runtime_state)
def _change_player(
config: "Config", fastanime_runtime_state: "FastAnimeRuntimeState"
):
"""Change the translation type to use
Args:
config: [TODO:description]
fastanime_runtime_state: [TODO:description]
"""
# prompt for new translation type
options = ["syncplay", "mpv-mod", "default"]
if config.use_fzf:
player = fzf.run(
options,
prompt="Select Player:",
)
elif config.use_rofi:
player = Rofi.run(options, "Select Player: ")
else:
player = fuzzy_inquirer(
options,
"Select Player",
)
# update internal config
if player == "syncplay":
config.sync_play = True
config.use_mpv_mod = False
else:
config.sync_play = False
if player == "mpv-mod":
config.use_mpv_mod = True
else:
config.use_mpv_mod = False
media_actions_menu(config, fastanime_runtime_state)
def _view_info(config: "Config", fastanime_runtime_state: "FastAnimeRuntimeState"): def _view_info(config: "Config", fastanime_runtime_state: "FastAnimeRuntimeState"):
"""helper function to view info of an anime from terminal """helper function to view info of an anime from terminal
@@ -1122,7 +1192,9 @@ def media_actions_menu(
         config: [TODO:description]
         fastanime_runtime_state: [TODO:description]
     """
-    options = ["allanime", "animepahe"]
+    from ...libs.anime_provider import anime_sources
+    options = list(anime_sources.keys())
     if config.use_fzf:
         provider = fzf.run(
             options, prompt="Select Translation Type:", header="Language Options"
@@ -1137,7 +1209,7 @@ def media_actions_menu(
     config.provider = provider
     config.anime_provider.provider = provider
-    config.anime_provider.lazyload_provider()
+    config.anime_provider.lazyload_provider(provider)
     media_actions_menu(config, fastanime_runtime_state)
@@ -1175,6 +1247,7 @@ def media_actions_menu(
         f"{'📖 ' if icons else ''}View Info": _view_info,
         f"{'🎧 ' if icons else ''}Change Translation Type": _change_translation_type,
         f"{'💽 ' if icons else ''}Change Provider": _change_provider,
+        f"{'💽 ' if icons else ''}Change Player": _change_player,
         f"{'🔘 ' if icons else ''}Toggle auto select anime": _toggle_auto_select,  # WARN: problematic if you choose an anime that doesnt match id
         f"{'💠 ' if icons else ''}Toggle auto next episode": _toggle_auto_next,
         f"{'🔘 ' if icons else ''}Toggle continue from history": _toggle_continue_from_history,
@@ -1244,9 +1317,9 @@ def anilist_results_menu(
     choices = [*anime_data.keys(), "Back"]
     if config.use_fzf:
         if config.preview:
-            from .utils import get_fzf_preview
-            preview = get_fzf_preview(search_results, anime_data.keys())
+            from .utils import get_fzf_anime_preview
+            preview = get_fzf_anime_preview(search_results, anime_data.keys())
             selected_anime_title = fzf.run(
                 choices,
                 prompt="Select Anime: ",
@@ -1427,6 +1500,9 @@ def fastanime_main_menu(
     else:
         config.load_config()
+        config.anime_provider.provider = config.provider
+        config.anime_provider.lazyload_provider(config.provider)
     fastanime_main_menu(config, fastanime_runtime_state)
     icons = config.icons

View File

@@ -168,7 +168,89 @@ def get_rofi_icons(
logger.error("%r generated an exception: %s" % (url, e)) logger.error("%r generated an exception: %s" % (url, e))
def get_fzf_preview( # get rofi icons
def get_fzf_episode_preview(
anilist_result: AnilistBaseMediaDataSchema, episodes, workers=None, wait=False
):
"""A helper function to make sure that the images are downloaded so they can be used as icons
Args:
titles (list[str]): sanitized titles of the anime; NOTE: its important that they are sanitized since they are used as the filenames of the images
workers ([TODO:parameter]): Number of threads to use to download the images; defaults to as many as possible
anilist_results: the anilist results from an anilist action
"""
HEADER_COLOR = 215, 0, 95
import re
def _worker():
# use concurrency to download the images as fast as possible
with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as executor:
# load the jobs
future_to_url = {}
for episode in episodes:
episode_title = ""
image_url = ""
for episode_detail in anilist_result["streamingEpisodes"]:
if re.match(f"Episode {episode}", episode_detail["title"]):
episode_title = episode_detail["title"]
image_url = episode_detail["thumbnail"]
if episode_title and image_url:
# actual link to download image from
if not image_url:
continue
future_to_url[
executor.submit(save_image_from_url, image_url, episode)
] = image_url
template = textwrap.dedent(
f"""
{get_true_fg('Anime Title:',*HEADER_COLOR)} {anilist_result['title']['romaji'] or anilist_result['title']['english']}
{get_true_fg('Episode Title:',*HEADER_COLOR)} {episode_title}
"""
)
future_to_url[
executor.submit(save_info_from_str, template, episode)
] = episode_title
# execute the jobs
for future in concurrent.futures.as_completed(future_to_url):
url = future_to_url[future]
try:
future.result()
except Exception as e:
logger.error("%r generated an exception: %s" % (url, e))
background_worker = Thread(
target=_worker,
)
# ensure images and info exists
background_worker.daemon = True
background_worker.start()
# the preview script is in bash so making sure fzf doesnt use any other shell lang to process the preview script
os.environ["SHELL"] = shutil.which("bash") or "bash"
preview = """
%s
if [ -s %s/{} ]; then fzf-preview %s/{}
else echo Loading...
fi
if [ -s %s/{} ]; then cat %s/{}
else echo Loading...
fi
""" % (
fzf_preview,
IMAGES_CACHE_DIR,
IMAGES_CACHE_DIR,
ANIME_INFO_CACHE_DIR,
ANIME_INFO_CACHE_DIR,
)
if wait:
background_worker.join()
return preview
def get_fzf_anime_preview(
    anilist_results: list[AnilistBaseMediaDataSchema], titles, wait=False
):
    """A helper function that constructs data to be used for the fzf preview

View File

@@ -55,6 +55,7 @@ def run_mpv(
     ytdl_format="",
     custom_args=[],
     headers={},
+    subtitles=[],
 ):
     # Determine if mpv is available
     MPV = shutil.which("mpv")
@@ -108,6 +109,8 @@ def run_mpv(
         for header_name, header_value in headers.items():
             mpv_headers += f"{header_name}:{header_value},"
         mpv_args.append(mpv_headers)
+    for subtitle in subtitles:
+        mpv_args.append(f"--sub-file={subtitle['url']}")
     if start_time != "0":
         mpv_args.append(f"--start={start_time}")
     if title:
if title: if title:
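A sketch of the argument list run_mpv now builds: one --sub-file per subtitle track alongside the combined --http-header-fields value. The stream and subtitle URLs are placeholders; the flags are standard mpv options.

# Sketch of launching mpv with per-track --sub-file arguments.
import shutil
import subprocess

link = "https://example.org/stream.m3u8"
headers = {"Referer": "https://allanime.to/"}
subtitles = [{"url": "https://example.org/eng.vtt", "language": "eng"}]
start_time = "0"
title = "Episode 1"

mpv_args = [link]
mpv_headers = "--http-header-fields="
for header_name, header_value in headers.items():
    mpv_headers += f"{header_name}:{header_value},"
mpv_args.append(mpv_headers)
for subtitle in subtitles:
    mpv_args.append(f"--sub-file={subtitle['url']}")
if start_time != "0":
    mpv_args.append(f"--start={start_time}")
mpv_args.append(f"--title={title}")

MPV = shutil.which("mpv")
if MPV:
    subprocess.run([MPV, *mpv_args])
else:
    print("mpv not found in PATH")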

View File

@@ -3,7 +3,7 @@ from typing import TYPE_CHECKING
 import mpv
 from ...anilist import AniList
-from .utils import filter_by_quality
+from .utils import filter_by_quality, move_preferred_subtitle_lang_to_top
 if TYPE_CHECKING:
     from typing import Literal
@@ -22,6 +22,7 @@ def format_time(duration_in_secs: float):
class MpvPlayer(object): class MpvPlayer(object):
anime_provider: "AnimeProvider" anime_provider: "AnimeProvider"
config: "Config" config: "Config"
subs = []
mpv_player: "mpv.MPV" mpv_player: "mpv.MPV"
last_stop_time: str = "0" last_stop_time: str = "0"
last_total_time: str = "0" last_total_time: str = "0"
@@ -113,7 +114,7 @@ class MpvPlayer(object):
) )
if not episode_streams: if not episode_streams:
self.mpv_player.show_text("No streams were found") self.mpv_player.show_text("No streams were found")
return None return
# always select the first # always select the first
if server == "top": if server == "top":
@@ -131,8 +132,20 @@ class MpvPlayer(object):
self.mpv_player.show_text( self.mpv_player.show_text(
f"Invalid server!!; servers available are: {episode_streams_dict.keys()}", f"Invalid server!!; servers available are: {episode_streams_dict.keys()}",
) )
return None return
self.current_media_title = selected_server["episode_title"] self.current_media_title = selected_server["episode_title"]
if config.normalize_titles:
import re
for episode_detail in fastanime_runtime_state.selected_anime_anilist[
"streamingEpisodes"
]:
if re.match(
f"Episode {current_episode_number}", episode_detail["title"]
):
self.current_media_title = episode_detail["title"]
break
links = selected_server["links"] links = selected_server["links"]
stream_link_ = filter_by_quality(quality, links) stream_link_ = filter_by_quality(quality, links)
@@ -142,6 +155,9 @@ class MpvPlayer(object):
self.mpv_player._set_property("start", "0") self.mpv_player._set_property("start", "0")
stream_link = stream_link_["link"] stream_link = stream_link_["link"]
fastanime_runtime_state.provider_current_episode_stream_link = stream_link fastanime_runtime_state.provider_current_episode_stream_link = stream_link
self.subs = move_preferred_subtitle_lang_to_top(
selected_server["subtitles"], config.sub_lang
)
return stream_link return stream_link
def create_player( def create_player(
@@ -151,8 +167,11 @@ class MpvPlayer(object):
fastanime_runtime_state, fastanime_runtime_state,
config: "Config", config: "Config",
title, title,
start_time,
headers={}, headers={},
subtitles=[],
): ):
self.subs = subtitles
self.anime_provider = anime_provider self.anime_provider = anime_provider
self.fastanime_runtime_state = fastanime_runtime_state self.fastanime_runtime_state = fastanime_runtime_state
self.config = config self.config = config
@@ -171,17 +190,6 @@ class MpvPlayer(object):
osc=True, osc=True,
ytdl=True, ytdl=True,
) )
mpv_player.force_window = config.force_window
# mpv_player.cache = "yes"
# mpv_player.cache_pause = "no"
mpv_player.title = title
mpv_headers = ""
if headers:
for header_name, header_value in headers.items():
mpv_headers += f"{header_name}:{header_value},"
mpv_player.http_header_fields = mpv_headers
mpv_player.play(stream_link)
# -- events -- # -- events --
@mpv_player.event_callback("file-loaded") @mpv_player.event_callback("file-loaded")
@@ -190,6 +198,22 @@ class MpvPlayer(object):
self.player_fetching = False self.player_fetching = False
if isinstance(d, float): if isinstance(d, float):
self.last_total_time = format_time(d) self.last_total_time = format_time(d)
try:
if not mpv_player.core_shutdown:
if self.subs:
for i, subtitle in enumerate(self.subs):
if i == 0:
flag = "select"
else:
flag = "auto"
mpv_player.sub_add(
subtitle["url"], flag, None, subtitle["language"]
)
self.subs = []
except mpv.ShutdownError:
pass
except Exception:
pass
@mpv_player.property_observer("time-pos") @mpv_player.property_observer("time-pos")
def handle_time_start_update(*args): def handle_time_start_update(*args):
@@ -218,7 +242,9 @@ class MpvPlayer(object):
def _next_episode(): def _next_episode():
url = self.get_episode("next") url = self.get_episode("next")
if url: if url:
mpv_player.loadfile(url, options=f"title={self.current_media_title}") mpv_player.loadfile(
url,
)
mpv_player.title = self.current_media_title mpv_player.title = self.current_media_title
@mpv_player.on_key_press("shift+p") @mpv_player.on_key_press("shift+p")
@@ -327,7 +353,23 @@ class MpvPlayer(object):
mpv_player.register_message_handler("select-quality", select_quality) mpv_player.register_message_handler("select-quality", select_quality)
self.mpv_player = mpv_player self.mpv_player = mpv_player
return mpv_player mpv_player.force_window = config.force_window
# mpv_player.cache = "yes"
# mpv_player.cache_pause = "no"
mpv_player.title = title
mpv_headers = ""
if headers:
for header_name, header_value in headers.items():
mpv_headers += f"{header_name}:{header_value},"
mpv_player.http_header_fields = mpv_headers
mpv_player.play(stream_link)
if not start_time == "0":
mpv_player.start = start_time
mpv_player.wait_for_shutdown()
mpv_player.terminate()
player = MpvPlayer() player = MpvPlayer()
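A minimal sketch of the python-mpv flow added above: play the stream, then add subtitle tracks once "file-loaded" fires, selecting the first (preferred-language) track. The URLs are placeholders; the sub_add(url, flags, title, lang) call mirrors the one in the player code and assumes a python-mpv version that accepts those arguments.

# Sketch of adding subtitle tracks after the file has loaded (python-mpv).
import mpv

stream_link = "https://example.org/stream.m3u8"
subs = [
    {"url": "https://example.org/eng.vtt", "language": "eng"},
    {"url": "https://example.org/spa.vtt", "language": "spa"},
]

player = mpv.MPV(
    input_default_bindings=True,
    input_vo_keyboard=True,
    osc=True,
    ytdl=True,
)
player.title = "Episode 1"
player.play(stream_link)


@player.event_callback("file-loaded")
def handle_file_loaded(*args):
    # the preferred language was moved to index 0, so "select" activates it;
    # the remaining tracks are added but left unselected
    for i, subtitle in enumerate(subs):
        flag = "select" if i == 0 else "auto"
        player.sub_add(subtitle["url"], flag, None, subtitle["language"])


player.wait_for_shutdown()
player.terminate()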

View File

@@ -4,7 +4,7 @@ import subprocess
 from .tools import exit_app
-def SyncPlayer(url: str, anime_title=None, headers={}, *args):
+def SyncPlayer(url: str, anime_title=None, headers={}, subtitles=[], *args):
     # TODO: handle m3u8 multi quality streams
     #
     # check for SyncPlay
@@ -20,6 +20,8 @@ def SyncPlayer(url: str, anime_title=None, headers={}, *args):
         for header_name, header_value in headers.items():
             mpv_headers += f"{header_name}:{header_value},"
         mpv_args.append(mpv_headers)
+    for subtitle in subtitles:
+        mpv_args.append(f"--sub-file={subtitle['url']}")
     if not anime_title:
         subprocess.run(
             [

View File

@@ -15,25 +15,14 @@ class FastAnimeRuntimeState(dict):
 def exit_app(exit_code=0, *args):
-    import os
-    import shutil
     import sys
+    from rich.console import Console
     from ...constants import APP_NAME, ICON_PATH, USER_NAME
-    def is_running_in_terminal():
-        try:
-            shutil.get_terminal_size()
-            return (
-                sys.stdin
-                and sys.stdin.isatty()
-                and sys.stdout.isatty()
-                and os.getenv("TERM") is not None
-            )
-        except OSError:
-            return False
-    if not is_running_in_terminal():
+    console = Console()
+    if not console.is_terminal:
         from plyer import notification
         notification.notify(
@@ -43,7 +32,6 @@ def exit_app(exit_code=0, *args):
             title="Shutting down",
         )  # pyright:ignore
     else:
-        from rich import print
-        print("Have a good day :smile:", USER_NAME)
+        console.clear()
+        console.print("Have a good day :smile:", USER_NAME)
     sys.exit(exit_code)
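A sketch of the terminal check the reworked exit_app relies on: rich's Console.is_terminal replaces the manual isatty()/TERM probing it used to do.

# Sketch: Console.is_terminal is False when stdout is piped or the app was
# launched outside a terminal (e.g. from a desktop entry).
import sys

from rich.console import Console

console = Console()
if not console.is_terminal:
    print("not attached to a terminal; a desktop notification is friendlier here", file=sys.stderr)
else:
    console.clear()
    console.print("Have a good day :smile:")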

View File

@@ -19,6 +19,46 @@ BG_GREEN = "\033[48;2;120;233;12;m"
GREEN = "\033[38;2;45;24;45;m" GREEN = "\033[38;2;45;24;45;m"
def get_requested_quality_or_default_to_first(url, quality):
import yt_dlp
with yt_dlp.YoutubeDL({"quiet": True, "silent": True, "no_warnings": True}) as ydl:
m3u8_info = ydl.extract_info(url, False)
if not m3u8_info:
return
m3u8_formats = m3u8_info["formats"]
quality = int(quality)
quality_u = quality - 80
quality_l = quality + 80
for m3u8_format in m3u8_formats:
if m3u8_format["height"] == quality or (
m3u8_format["height"] < quality_u and m3u8_format["height"] > quality_l
):
return m3u8_format["url"]
else:
return m3u8_formats[0]["url"]
def move_preferred_subtitle_lang_to_top(sub_list, lang_str):
"""Moves the dictionary with the given ID to the front of the list.
Args:
sub_list: list of subs
lang_str: the sub lang pref
Returns:
The modified list.
"""
import re
for i, d in enumerate(sub_list):
if re.search(lang_str, d["language"], re.IGNORECASE):
sub_list.insert(0, sub_list.pop(i))
break
return sub_list
def filter_by_quality(quality: str, stream_links: "list[EpisodeStream]", default=True):
    """Helper function used to filter a list of EpisodeStream objects to one that has a corresponding quality
@@ -106,7 +146,7 @@ def fuzzy_inquirer(choices: list, prompt: str, **kwargs):
     from click import clear
     clear()
-    action = inquirer.fuzzy(
+    action = inquirer.fuzzy(  # pyright:ignore
         prompt,
         choices,
         height="100%",

View File

@@ -3,7 +3,9 @@ import sys
 from pathlib import Path
 from platform import system
-from . import APP_NAME, AUTHOR, __version__
+import click
+from . import APP_NAME, __version__
 PLATFORM = system()
@@ -17,19 +19,20 @@ if PLATFORM == "Windows":
     ICON_PATH = os.path.join(ASSETS_DIR, "logo.ico")
 else:
     ICON_PATH = os.path.join(ASSETS_DIR, "logo.png")
-PREVIEW_IMAGE = os.path.join(ASSETS_DIR, "preview")
+# PREVIEW_IMAGE = os.path.join(ASSETS_DIR, "preview")
 # ----- user configs and data -----
 S_PLATFORM = sys.platform
+APP_DATA_DIR = click.get_app_dir(APP_NAME)
 if S_PLATFORM == "win32":
     # app data
-    app_data_dir_base = os.getenv("LOCALAPPDATA")
-    if not app_data_dir_base:
-        raise RuntimeError("Could not determine app data dir please report to devs")
-    APP_DATA_DIR = os.path.join(app_data_dir_base, AUTHOR, APP_NAME)
+    # app_data_dir_base = os.getenv("LOCALAPPDATA")
+    # if not app_data_dir_base:
+    #     raise RuntimeError("Could not determine app data dir please report to devs")
+    # APP_DATA_DIR = os.path.join(app_data_dir_base, AUTHOR, APP_NAME)
+    #
     # cache dir
     APP_CACHE_DIR = os.path.join(APP_DATA_DIR, "cache")
@@ -39,9 +42,9 @@ if S_PLATFORM == "win32":
 elif S_PLATFORM == "darwin":
     # app data
-    app_data_dir_base = os.path.expanduser("~/Library/Application Support")
-    APP_DATA_DIR = os.path.join(app_data_dir_base, APP_NAME, __version__)
+    # app_data_dir_base = os.path.expanduser("~/Library/Application Support")
+    # APP_DATA_DIR = os.path.join(app_data_dir_base, APP_NAME, __version__)
+    #
     # cache dir
     cache_dir_base = os.path.expanduser("~/Library/Caches")
     APP_CACHE_DIR = os.path.join(cache_dir_base, APP_NAME, __version__)
@@ -50,12 +53,12 @@ elif S_PLATFORM == "darwin":
     video_dir_base = os.path.expanduser("~/Movies")
     USER_VIDEOS_DIR = os.path.join(video_dir_base, APP_NAME)
 else:
-    # app data
-    app_data_dir_base = os.environ.get("XDG_CONFIG_HOME", "")
-    if not app_data_dir_base.strip():
-        app_data_dir_base = os.path.expanduser("~/.config")
-    APP_DATA_DIR = os.path.join(app_data_dir_base, APP_NAME)
+    # # app data
+    # app_data_dir_base = os.environ.get("XDG_CONFIG_HOME", "")
+    # if not app_data_dir_base.strip():
+    #     app_data_dir_base = os.path.expanduser("~/.config")
+    # APP_DATA_DIR = os.path.join(app_data_dir_base, APP_NAME)
+    #
     # cache dir
     cache_dir_base = os.environ.get("XDG_CACHE_HOME", "")
     if not cache_dir_base.strip():
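A sketch of what the switch to click.get_app_dir buys: a single call resolves the per-platform config directory that the commented-out branches above used to compute by hand.

# Sketch of click.get_app_dir resolving the app data directory per platform.
import click

APP_NAME = "FastAnime"

APP_DATA_DIR = click.get_app_dir(APP_NAME)
print(APP_DATA_DIR)
# roughly:
#   Linux:   ~/.config/fastanime (or under $XDG_CONFIG_HOME)
#   macOS:   ~/Library/Application Support/FastAnime
#   Windows: %APPDATA%\FastAnime (pass roaming=False for %LOCALAPPDATA%)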

View File

@@ -309,9 +309,13 @@ class AniListApi:
         status_not_in: list[str] | None = None,
         endDate_greater: int | None = None,
         endDate_lesser: int | None = None,
-        start_greater: int | None = None,
-        start_lesser: int | None = None,
+        startDate_greater: int | None = None,
+        startDate_lesser: int | None = None,
+        startDate: str | None = None,
+        seasonYear: str | None = None,
         page: int | None = None,
+        season: str | None = None,
+        format_in: list[str] | None = None,
         type="ANIME",
         **kwargs,
     ):
@@ -320,7 +324,7 @@ class AniListApi:
         """
         variables = {}
         for key, val in list(locals().items())[1:]:
-            if val is not None and key not in ["variables"]:
+            if val and key not in ["variables"]:
                 variables[key] = val
         search_results = self.get_data(search_query, variables=variables)
         return search_results

View File

@@ -147,6 +147,11 @@ query ($userId: Int, $status: MediaListStatus,$type:MediaType) {
id id
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
episodes episodes
@@ -221,11 +226,15 @@ $popularity_greater:Int,\
$popularity_lesser:Int,\ $popularity_lesser:Int,\
$averageScore_greater:Int,\ $averageScore_greater:Int,\
$averageScore_lesser:Int,\ $averageScore_lesser:Int,\
$seasonYear:Int,\
$startDate_greater:FuzzyDateInt,\ $startDate_greater:FuzzyDateInt,\
$startDate_lesser:FuzzyDateInt,\ $startDate_lesser:FuzzyDateInt,\
$startDate:FuzzyDateInt,\
$endDate_greater:FuzzyDateInt,\ $endDate_greater:FuzzyDateInt,\
$endDate_lesser:FuzzyDateInt,\ $endDate_lesser:FuzzyDateInt,\
$format_in:[MediaFormat],\
$type:MediaType\ $type:MediaType\
$season:MediaSeason\
" "
# FuzzyDateInt = (yyyymmdd) # FuzzyDateInt = (yyyymmdd)
# MediaStatus = (FINISHED,RELEASING,NOT_YET_RELEASED,CANCELLED,HIATUS) # MediaStatus = (FINISHED,RELEASING,NOT_YET_RELEASED,CANCELLED,HIATUS)
@@ -247,6 +256,7 @@ query($query:String,%s){
tag_not_in:$tag_not_in, tag_not_in:$tag_not_in,
status_in:$status_in, status_in:$status_in,
status:$status, status:$status,
startDate:$startDate,
status_not_in:$status_not_in, status_not_in:$status_not_in,
popularity_greater:$popularity_greater, popularity_greater:$popularity_greater,
popularity_lesser:$popularity_lesser, popularity_lesser:$popularity_lesser,
@@ -256,7 +266,10 @@ query($query:String,%s){
startDate_lesser:$startDate_lesser, startDate_lesser:$startDate_lesser,
endDate_greater:$endDate_greater, endDate_greater:$endDate_greater,
endDate_lesser:$endDate_lesser, endDate_lesser:$endDate_lesser,
format_in:$format_in,
sort:$sort, sort:$sort,
season:$season,
seasonYear:$seasonYear,
type:$type type:$type
) )
{ {
@@ -281,6 +294,11 @@ query($query:String,%s){
progress progress
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
episodes episodes
@@ -338,6 +356,11 @@ query($type:MediaType){
id id
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
genres genres
@@ -404,6 +427,15 @@ query($type:MediaType){
progress progress
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
episodes episodes
@@ -464,6 +496,11 @@ query($type:MediaType){
progress progress
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
episodes episodes
favourites favourites
averageScore averageScore
@@ -519,6 +556,11 @@ query($type:MediaType){
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
description description
@@ -583,6 +625,11 @@ query($type:MediaType){
progress progress
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
description description
@@ -651,6 +698,11 @@ query($type:MediaType){
genres genres
averageScore averageScore
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
tags { tags {
name name
@@ -745,6 +797,11 @@ query ($id: Int,$type:MediaType) {
genres genres
averageScore averageScore
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
tags { tags {
name name
@@ -819,6 +876,11 @@ query ($page: Int,$type:MediaType) {
progress progress
} }
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
averageScore averageScore
genres genres
@@ -935,6 +997,11 @@ query($id:Int){
countryOfOrigin countryOfOrigin
averageScore averageScore
popularity popularity
streamingEpisodes{
title
thumbnail
}
favourites favourites
source source
hashtag hashtag
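A standalone sketch of requesting the streamingEpisodes fields the queries above now include, sent to the public AniList GraphQL endpoint; the search term is arbitrary.

# Sketch: fetch streamingEpisodes (title + thumbnail) from AniList.
import requests

query = """
query ($search: String) {
  Media(search: $search, type: ANIME) {
    title { romaji english }
    streamingEpisodes {
      title
      thumbnail
    }
  }
}
"""

response = requests.post(
    "https://graphql.anilist.co",
    json={"query": query, "variables": {"search": "One Piece"}},
    timeout=10,
)
media = response.json()["data"]["Media"]
for episode in media["streamingEpisodes"][:3]:
    print(episode["title"], "->", episode["thumbnail"])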

View File

@@ -136,6 +136,11 @@ class AnilistMediaListProperties(TypedDict):
     hiddenFromStatusLists: bool
+class StreamingEpisode(TypedDict):
+    title: str
+    thumbnail: str
 class AnilistBaseMediaDataSchema(TypedDict):
     """
     This a convenience class is used to type the received Anilist data to enhance dev experience
@@ -159,6 +164,7 @@ class AnilistBaseMediaDataSchema(TypedDict):
     status: str
     nextAiringEpisode: AnilistMediaNextAiringEpisode
     season: str
+    streamingEpisodes: list[StreamingEpisode]
     seasonYear: int
     duration: int
     synonyms: list[str]

View File

@@ -1,13 +1,11 @@
+from .allanime.constants import SERVERS_AVAILABLE as ALLANIME_SERVERS
+from .animepahe.constants import SERVERS_AVAILABLE as ANIMEPAHESERVERS
+from .aniwatch.constants import SERVERS_AVAILABLE as ANIWATCHSERVERS
 anime_sources = {
     "allanime": "api.AllAnimeAPI",
     "animepahe": "api.AnimePaheApi",
+    "aniwatch": "api.AniWatchApi",
+    "aniwave": "api.AniWaveApi",
 }
-SERVERS_AVAILABLE = [
-    "sharepoint",
-    "dropbox",
-    "gogoanime",
-    "weTransfer",
-    "wixmp",
-    "kwik",
-    "Yt",
-]
+SERVERS_AVAILABLE = [*ALLANIME_SERVERS, *ANIMEPAHESERVERS, *ANIWATCHSERVERS]
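The anime_sources mapping stores dotted "module.Class" paths rather than imported classes, which is what makes lazy loading possible. Below is a hedged sketch of how a lazyload_provider(provider) call could resolve such an entry with importlib; the package path and this loader are illustrative, not FastAnime's actual implementation, and running it requires the package to be importable.

# Hedged sketch of resolving a "module.Class" registry entry with importlib.
import importlib

anime_sources = {
    "allanime": "api.AllAnimeAPI",
    "animepahe": "api.AnimePaheApi",
}


def lazyload_provider(provider: str, package: str = "fastanime.libs.anime_provider"):
    # e.g. "api.AllAnimeAPI" -> module "…anime_provider.allanime.api", class "AllAnimeAPI"
    module_name, class_name = anime_sources[provider].rsplit(".", 1)
    module = importlib.import_module(f"{package}.{provider}.{module_name}")
    return getattr(module, class_name)


if __name__ == "__main__":
    provider_class = lazyload_provider("allanime")
    print(provider_class)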

View File

@@ -11,12 +11,7 @@ from requests.exceptions import Timeout
from ...anime_provider.base_provider import AnimeProvider from ...anime_provider.base_provider import AnimeProvider
from ..utils import give_random_quality, one_digit_symmetric_xor from ..utils import give_random_quality, one_digit_symmetric_xor
from .constants import ( from .constants import ALLANIME_API_ENDPOINT, ALLANIME_BASE, ALLANIME_REFERER
ALLANIME_API_ENDPOINT,
ALLANIME_BASE,
ALLANIME_REFERER,
USER_AGENT,
)
from .gql_queries import ALLANIME_EPISODES_GQL, ALLANIME_SEARCH_GQL, ALLANIME_SHOW_GQL from .gql_queries import ALLANIME_EPISODES_GQL, ALLANIME_SEARCH_GQL, ALLANIME_SHOW_GQL
if TYPE_CHECKING: if TYPE_CHECKING:
@@ -36,6 +31,9 @@ class AllAnimeAPI(AnimeProvider):
""" """
api_endpoint = ALLANIME_API_ENDPOINT api_endpoint = ALLANIME_API_ENDPOINT
HEADERS = {
"Referer": ALLANIME_REFERER,
}
def _fetch_gql(self, query: str, variables: dict): def _fetch_gql(self, query: str, variables: dict):
"""main abstraction over all requests to the allanime api """main abstraction over all requests to the allanime api
@@ -54,7 +52,6 @@ class AllAnimeAPI(AnimeProvider):
"variables": json.dumps(variables), "variables": json.dumps(variables),
"query": query, "query": query,
}, },
headers={"Referer": ALLANIME_REFERER, "User-Agent": USER_AGENT},
timeout=10, timeout=10,
) )
if response.status_code == 200: if response.status_code == 200:
@@ -231,6 +228,7 @@ class AllAnimeAPI(AnimeProvider):
"server": "Yt", "server": "Yt",
"episode_title": f'{anime["title"]}; Episode {episode_number}', "episode_title": f'{anime["title"]}; Episode {episode_number}',
"headers": {"Referer": f"https://{ALLANIME_BASE}/"}, "headers": {"Referer": f"https://{ALLANIME_BASE}/"},
"subtitles": [],
"links": [ "links": [
{ {
"link": url, "link": url,
@@ -246,10 +244,6 @@ class AllAnimeAPI(AnimeProvider):
) )
resp = self.session.get( resp = self.session.get(
embed_url, embed_url,
headers={
"Referer": ALLANIME_REFERER,
"User-Agent": USER_AGENT,
},
timeout=10, timeout=10,
) )
@@ -260,6 +254,7 @@ class AllAnimeAPI(AnimeProvider):
yield { yield {
"server": "gogoanime", "server": "gogoanime",
"headers": {}, "headers": {},
"subtitles": [],
"episode_title": ( "episode_title": (
allanime_episode["notes"] or f'{anime["title"]}' allanime_episode["notes"] or f'{anime["title"]}'
) )
@@ -271,6 +266,7 @@ class AllAnimeAPI(AnimeProvider):
yield { yield {
"server": "wetransfer", "server": "wetransfer",
"headers": {}, "headers": {},
"subtitles": [],
"episode_title": ( "episode_title": (
allanime_episode["notes"] or f'{anime["title"]}' allanime_episode["notes"] or f'{anime["title"]}'
) )
@@ -282,6 +278,7 @@ class AllAnimeAPI(AnimeProvider):
yield { yield {
"server": "sharepoint", "server": "sharepoint",
"headers": {}, "headers": {},
"subtitles": [],
"episode_title": ( "episode_title": (
allanime_episode["notes"] or f'{anime["title"]}' allanime_episode["notes"] or f'{anime["title"]}'
) )
@@ -293,6 +290,7 @@ class AllAnimeAPI(AnimeProvider):
yield { yield {
"server": "dropbox", "server": "dropbox",
"headers": {}, "headers": {},
"subtitles": [],
"episode_title": ( "episode_title": (
allanime_episode["notes"] or f'{anime["title"]}' allanime_episode["notes"] or f'{anime["title"]}'
) )
@@ -304,6 +302,7 @@ class AllAnimeAPI(AnimeProvider):
yield { yield {
"server": "wixmp", "server": "wixmp",
"headers": {}, "headers": {},
"subtitles": [],
"episode_title": ( "episode_title": (
allanime_episode["notes"] or f'{anime["title"]}' allanime_episode["notes"] or f'{anime["title"]}'
) )
@@ -322,85 +321,3 @@ class AllAnimeAPI(AnimeProvider):
except Exception as e: except Exception as e:
logger.error(f"FA(Allanime): {e}") logger.error(f"FA(Allanime): {e}")
return [] return []
if __name__ == "__main__":
anime_provider = AllAnimeAPI()
# lets see if it works :)
import subprocess
import sys
from InquirerPy import inquirer, validator # pyright:ignore
anime = input("Enter the anime name: ")
translation = input("Enter the translation type: ")
search_results = anime_provider.search_for_anime(
anime, translation_type=translation.strip()
)
if not search_results:
raise Exception("No results found")
search_results = search_results["results"]
options = {show["title"]: show for show in search_results}
anime = inquirer.fuzzy(
"Enter the anime title",
list(options.keys()),
validate=validator.EmptyInputValidator(),
).execute()
if anime is None:
print("No anime was selected")
sys.exit(1)
anime_result = options[anime]
anime_data = anime_provider.get_anime(anime_result["id"])
if not anime_data:
raise Exception("Anime not found")
availableEpisodesDetail = anime_data["availableEpisodesDetail"]
if not availableEpisodesDetail.get(translation.strip()):
raise Exception("No episodes found")
stream_link = True
while stream_link != "quit":
print("select episode")
episode = inquirer.fuzzy(
"Choose an episode",
availableEpisodesDetail[translation.strip()],
validate=validator.EmptyInputValidator(),
).execute()
if episode is None:
print("No episode was selected")
sys.exit(1)
if not anime_data:
print("Sth went wrong")
break
episode_streams_ = anime_provider.get_episode_streams(
anime_data, # pyright: ignore
episode,
translation.strip(),
)
if episode_streams_ is None:
raise Exception("Episode not found")
episode_streams = list(episode_streams_)
stream_links = []
for server in episode_streams:
stream_links.extend([link["link"] for link in server["links"]])
stream_links.append("back")
stream_link = inquirer.fuzzy(
"Choose a link to stream",
stream_links,
validate=validator.EmptyInputValidator(),
).execute()
if stream_link == "quit":
print("Have a nice day")
sys.exit()
if not stream_link:
raise Exception("No stream was selected")
title = episode_streams[0].get(
"episode_title", "%s: Episode %s" % (anime_data["title"], episode)
)
subprocess.run(["mpv", f"--title={title}", stream_link])

View File

@@ -1,7 +1,4 @@
-from yt_dlp.utils.networking import random_user_agent
+SERVERS_AVAILABLE = ["sharepoint", "dropbox", "gogoanime", "weTransfer", "wixmp", "Yt"]
ALLANIME_BASE = "allanime.day"
ALLANIME_REFERER = "https://allanime.to/"
ALLANIME_API_ENDPOINT = "https://api.{}/api/".format(ALLANIME_BASE)
-USER_AGENT = random_user_agent()
-SERVERS_AVAILABLE = ["sharepoint", "dropbox", "gogoanime", "weTransfer", "wixmp"]

View File

@@ -32,12 +32,14 @@ KWIK_RE = re.compile(r"Player\|(.+?)'")
class AnimePaheApi(AnimeProvider):
search_page: "AnimePaheSearchPage"
anime: "AnimePaheAnimePage"
+HEADERS = REQUEST_HEADERS
def search_for_anime(self, user_query: str, *args):
try:
url = f"{ANIMEPAHE_ENDPOINT}m=search&q={user_query}"
-headers = {**REQUEST_HEADERS}
-response = self.session.get(url, headers=headers)
+response = self.session.get(
+url,
+)
if not response.status_code == 200:
return
data: "AnimePaheSearchPage" = response.json()
@@ -85,7 +87,9 @@ class AnimePaheApi(AnimeProvider):
url,
page,
):
-response = self.session.get(url, headers=REQUEST_HEADERS)
+response = self.session.get(
+url,
+)
if response.status_code == 200:
if not data:
data.update(response.json())
@@ -136,7 +140,7 @@ class AnimePaheApi(AnimeProvider):
},
"episodesInfo": [
{
-"title": episode["title"] or f"{title};{episode['episode']}",
+"title": f"{episode['title'] or title};{episode['episode']}",
"episode": episode["episode"],
"id": episode["session"],
"translation_type": episode["audio"],
@@ -171,7 +175,7 @@ class AnimePaheApi(AnimeProvider):
anime_id = anime["id"]
# fetch the episode page
url = f"{ANIMEPAHE_BASE}/play/{anime_id}/{episode['session']}"
-response = self.session.get(url, headers=REQUEST_HEADERS)
+response = self.session.get(url)
# get the element containing links to juicy streams
c = get_element_by_id("resolutionMenu", response.text)
resolutionMenuItems = get_elements_html_by_class("dropdown-item", c)
@@ -190,6 +194,7 @@ class AnimePaheApi(AnimeProvider):
"server": "kwik",
"links": [],
"episode_title": episode_title,
+"subtitles": [],
"headers": {},
}
for res_dict in res_dicts:
@@ -206,7 +211,11 @@ class AnimePaheApi(AnimeProvider):
)
return []
# get embed page
-embed_response = self.session.get(embed_url, headers=SERVER_HEADERS)
+embed_response = self.session.get(
+embed_url, headers={"User-Agent": self.USER_AGENT, **SERVER_HEADERS}
+)
+if not response.status_code == 200:
+continue
embed_page = embed_response.text
decoded_js = process_animepahe_embed_page(embed_page)

View File

@@ -1,18 +1,14 @@
-from yt_dlp.utils.networking import random_user_agent
-USER_AGENT = random_user_agent()
ANIMEPAHE = "animepahe.ru"
ANIMEPAHE_BASE = f"https://{ANIMEPAHE}"
ANIMEPAHE_ENDPOINT = f"{ANIMEPAHE_BASE}/api?"
+SERVERS_AVAILABLE = ["kwik"]
REQUEST_HEADERS = {
"Cookie": "__ddgid_=VvX0ebHrH2DsFZo4; __ddgmark_=3savRpSVFhvZcn5x; __ddg2_=buBJ3c4pNBYKFZNp; __ddg1_=rbVADKr9URtt55zoIGFa; SERVERID=janna; XSRF-TOKEN=eyJpdiI6IjV5bFNtd0phUHgvWGJxc25wL0VJSUE9PSIsInZhbHVlIjoicEJTZktlR2hxR2JZTWhnL0JzazlvZU5TQTR2bjBWZ2dDb0RwUXVUUWNSclhQWUhLRStYSmJmWmUxWkpiYkFRYU12RjFWejlSWHorME1wZG5qQ1U0TnFlNnBFR2laQjN1MjdyNjc5TjVPdXdJb2o5VkU1bEduRW9pRHNDTHh6Sy8iLCJtYWMiOiI0OTc0ZmNjY2UwMGJkOWY2MWNkM2NlMjk2ZGMyZGJmMWE0NTdjZTdkNGI2Y2IwNTIzZmFiZWU5ZTE2OTk0YmU4IiwidGFnIjoiIn0%3D; laravel_session=eyJpdiI6ImxvdlpqREFnTjdaeFJubUlXQWlJVWc9PSIsInZhbHVlIjoiQnE4R3VHdjZ4M1NDdEVWM1ZqMUxtNnVERnJCcmtCUHZKNzRPR2RFbzNFcStTL29xdnVTbWhsNVRBUXEybVZWNU1UYVlTazFqYlN5UjJva1k4czNGaXBTbkJJK01oTUd3VHRYVHBoc3dGUWxHYnFlS2NJVVNFbTFqMVBWdFpuVUgiLCJtYWMiOiI1NDdjZTVkYmNhNjUwZTMxZmRlZmVmMmRlMGNiYjAwYjlmYjFjY2U0MDc1YTQzZThiMTIxMjJlYTg1NTA4YjBmIiwidGFnIjoiIn0%3D; latest=5592 ",
"Host": ANIMEPAHE,
-"User-Agent": USER_AGENT,
"Accept": "application , text/javascript, */*; q=0.01",
-"Accept-Encoding": "gzip, deflate, br, zstd",
+"Accept-Encoding": "Utf-8",
"Referer": ANIMEPAHE_BASE,
-"X-Requested-With": "XMLHttpRequest",
"DNT": "1",
"Connection": "keep-alive",
"Sec-Fetch-Dest": "empty",
@@ -21,19 +17,17 @@ REQUEST_HEADERS = {
"TE": "trailers",
}
SERVER_HEADERS = {
-"User-Agent": USER_AGENT,
+"Host": "kwik.si",
"Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/png,image/svg+xml,*/*;q=0.8",
"Accept-Language": "en-US,en;q=0.5",
-"Accept-Encoding": "gzip, deflate, br, zstd",
+"Accept-Encoding": "Utf-8",
"DNT": "1",
-"Alt-Used": "kwik.si",
"Connection": "keep-alive",
-"Referer": ANIMEPAHE_BASE,
+"Referer": "https://animepahe.ru/",
+"Cookie": "kwik_session=eyJpdiI6IlZ5UDd0c0lKTDB1NXlhTHZPeWxFc2c9PSIsInZhbHVlIjoieDJZbGhZUG1QZDNaeWtqR3lwWFNnREdhaHBxNVZRMWNDOHVucGpiMHRJOVdhVmpBc3lpTko1VExRMTFWcE1yUVJtVitoTWdOOU5ObTQ0Q0dHU0MzZU0yRUVvNmtWcUdmY3R4UWx4YklJTmpUL0ZodjhtVEpjWU96cEZoUUhUbVYiLCJtYWMiOiI2OGY2YThkOGU0MTgwOThmYzcyZThmNzFlZjlhMzQzMDgwNjlmMTc4NTIzMzc2YjE3YjNmMWQyNTk4NzczMmZiIiwidGFnIjoiIn0%3D; srv=s0; cf_clearance=QMoZtUpZrX0Mh4XJiFmFSSmoWndISPne5FcsGmKKvTQ-1723297585-1.0.1.1-6tVUnP.aef9XeNj0CnN.19D1el_r53t.lhqddX.J88gohH9UnsPWKeJ4yT0pTbcaGRbPuXTLOS.U72.wdy.gMg",
"Upgrade-Insecure-Requests": "1",
"Sec-Fetch-Dest": "iframe",
"Sec-Fetch-Mode": "navigate",
"Sec-Fetch-Site": "cross-site",
-"Sec-Fetch-User": "?1",
"Priority": "u=4",
-"TE": "trailers",
}

View File

@@ -0,0 +1,236 @@
import logging
import re
from html.parser import HTMLParser
from itertools import cycle
from urllib.parse import quote_plus
from yt_dlp.utils import (
clean_html,
extract_attributes,
get_element_by_class,
get_element_html_by_class,
get_elements_by_class,
get_elements_html_by_class,
)
from ..base_provider import AnimeProvider
from ..utils import give_random_quality
from .constants import SERVERS_AVAILABLE
from .types import AniWatchStream
logger = logging.getLogger(__name__)
LINK_TO_STREAMS_REGEX = re.compile(r".*://(.*)/embed-(2|4|6)/e-([0-9])/(.*)\?.*")
IMAGE_HTML_ELEMENT_REGEX = re.compile(r"<img.*?>")
class ParseAnchorAndImgTag(HTMLParser):
def __init__(self):
super().__init__()
self.img_tag = None
self.a_tag = None
def handle_starttag(self, tag, attrs):
if tag == "img":
self.img_tag = {attr[0]: attr[1] for attr in attrs}
if tag == "a":
self.a_tag = {attr[0]: attr[1] for attr in attrs}
class AniWatchApi(AnimeProvider):
# HEADERS = {"Referer": "https://hianime.to/home"}
def search_for_anime(self, anime_title: str, *args):
try:
query = quote_plus(anime_title)
url = f"https://hianime.to/search?keyword={query}"
response = self.session.get(url)
if response.status_code != 200:
return
search_page = response.text
search_results_html_items = get_elements_by_class("flw-item", search_page)
results = []
for search_results_html_item in search_results_html_items:
film_poster_html = get_element_by_class(
"film-poster", search_results_html_item
)
if not film_poster_html:
continue
# get availableEpisodes
episodes_html = get_element_html_by_class("tick-sub", film_poster_html)
episodes = clean_html(episodes_html) or 12
# get anime id and poster image url
parser = ParseAnchorAndImgTag()
parser.feed(film_poster_html)
image_data = parser.img_tag
anime_link_data = parser.a_tag
if not image_data or not anime_link_data:
continue
episodes = int(episodes)
# finally!!
image_link = image_data["data-src"]
anime_id = anime_link_data["data-id"]
title = anime_link_data["title"]
results.append(
{
"availableEpisodes": list(range(1, episodes)),
"id": anime_id,
"title": title,
"poster": image_link,
}
)
self.search_results = results
return {"pageInfo": {}, "results": results}
except Exception as e:
logger.error(e)
def get_anime(self, aniwatch_id, *args):
try:
anime_result = {}
for anime in self.search_results:
if anime["id"] == aniwatch_id:
anime_result = anime
break
anime_url = f"https://hianime.to/ajax/v2/episode/list/{aniwatch_id}"
response = self.session.get(anime_url, timeout=10)
if response.status_code == 200:
response_json = response.json()
aniwatch_anime_page = response_json["html"]
episodes_info_container_html = get_element_html_by_class(
"ss-list", aniwatch_anime_page
)
episodes_info_html_list = get_elements_html_by_class(
"ep-item", episodes_info_container_html
)
# keys: [ data-number: episode_number, data-id: episode_id, title: episode_title , href:episode_page_url]
episodes_info_dicts = [
extract_attributes(episode_dict)
for episode_dict in episodes_info_html_list
]
episodes = [episode["data-number"] for episode in episodes_info_dicts]
self.episodes_info = [
{
"id": episode["data-id"],
"title": (
(episode["title"] or "").replace(
f"Episode {episode['data-number']}", ""
)
or anime_result["title"]
)
+ f"; Episode {episode['data-number']}",
"episode": episode["data-number"],
}
for episode in episodes_info_dicts
]
return {
"id": aniwatch_id,
"availableEpisodesDetail": {
"dub": episodes,
"sub": episodes,
"raw": episodes,
},
"poster": anime_result["poster"],
"title": anime_result["title"],
"episodes_info": self.episodes_info,
}
except Exception as e:
logger.error(e)
def get_episode_streams(self, anime, episode, translation_type, *args):
try:
episode_details = [
episode_details
for episode_details in self.episodes_info
if episode_details["episode"] == episode
]
if not episode_details:
return
episode_details = episode_details[0]
episode_url = f"https://hianime.to/ajax/v2/episode/servers?episodeId={episode_details['id']}"
response = self.session.get(episode_url)
if response.status_code == 200:
response_json = response.json()
episode_page_html = response_json["html"]
servers_containers_html = get_elements_html_by_class(
"ps__-list", episode_page_html
)
if not servers_containers_html:
return
# sub servers
try:
servers_html_sub = get_elements_html_by_class(
"server-item", servers_containers_html[0]
)
except Exception:
logger.warn("AniWatch: sub not found")
servers_html_sub = None
# dub servers
try:
servers_html_dub = get_elements_html_by_class(
"server-item", servers_containers_html[1]
)
except Exception:
logger.warn("AniWatch: dub not found")
servers_html_dub = None
if translation_type == "dub":
servers_html = servers_html_dub
else:
servers_html = servers_html_sub
if not servers_html:
return
for server_name, server_html in zip(
cycle(SERVERS_AVAILABLE), servers_html
):
try:
# keys: [ data-type: translation_type, data-id: embed_id, data-server-id: server_id ]
servers_info = extract_attributes(server_html)
embed_url = f"https://hianime.to/ajax/v2/episode/sources?id={servers_info['data-id']}"
embed_response = self.session.get(embed_url)
if embed_response.status_code == 200:
embed_json = embed_response.json()
raw_link_to_streams = embed_json["link"]
match = LINK_TO_STREAMS_REGEX.match(raw_link_to_streams)
if not match:
continue
provider_domain = match.group(1)
embed_type = match.group(2)
episode_number = match.group(3)
source_id = match.group(4)
link_to_streams = f"https://{provider_domain}/embed-{embed_type}/ajax/e-{episode_number}/getSources?id={source_id}"
link_to_streams_response = self.session.get(link_to_streams)
if link_to_streams_response.status_code == 200:
juicy_streams_json: "AniWatchStream" = (
link_to_streams_response.json()
)
yield {
"headers": {},
"subtitles": [
{
"url": track["file"],
"language": track["label"],
}
for track in juicy_streams_json["tracks"]
if track["kind"] == "captions"
],
"server": server_name,
"episode_title": episode_details["title"],
"links": give_random_quality(
[
{"link": link["file"], "type": link["type"]}
for link in juicy_streams_json["sources"]
]
),
}
except Exception as e:
logger.error(e)
except Exception as e:
logger.error(e)
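
A worked example (with a made-up embed URL, since the real hosts vary) of how LINK_TO_STREAMS_REGEX turns the "link" value returned by /ajax/v2/episode/sources into the getSources endpoint used above:

# hypothetical embed link of the shape the regex expects
raw_link = "https://megacloud.tv/embed-2/e-1/abc123?k=1"
match = LINK_TO_STREAMS_REGEX.match(raw_link)
if match:
    domain, embed_type, episode_number, source_id = match.groups()
    print(f"https://{domain}/embed-{embed_type}/ajax/e-{episode_number}/getSources?id={source_id}")
    # -> https://megacloud.tv/embed-2/ajax/e-1/getSources?id=abc123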

View File

@@ -0,0 +1 @@
SERVERS_AVAILABLE = ["HD1", "HD2", "StreamSB", "StreamTape"]

View File

@@ -0,0 +1,26 @@
from typing import Literal, TypedDict
class AniWatchSkipTime(TypedDict):
start: int
end: int
class AniWatchSource(TypedDict):
file: str
type: str
class AniWatchTrack(TypedDict):
file: str
label: str
kind: Literal["captions", "thumbnails", "audio"]
class AniWatchStream(TypedDict):
sources: list[AniWatchSource]
tracks: list[AniWatchTrack]
encrypted: bool
intro: AniWatchSkipTime
outro: AniWatchSkipTime
server: int
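
A hand-written payload (not real API output) matching AniWatchStream, together with the same filtering the AniWatch provider applies to turn caption tracks into subtitle entries:

stream: AniWatchStream = {
    "sources": [{"file": "https://example.org/master.m3u8", "type": "hls"}],
    "tracks": [
        {"file": "https://example.org/eng.vtt", "label": "English", "kind": "captions"},
        {"file": "https://example.org/thumbs.vtt", "label": "thumbnails", "kind": "thumbnails"},
    ],
    "encrypted": False,
    "intro": {"start": 0, "end": 90},
    "outro": {"start": 1320, "end": 1410},
    "server": 4,
}
subtitles = [
    {"url": track["file"], "language": track["label"]}
    for track in stream["tracks"]
    if track["kind"] == "captions"
]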

View File

@@ -0,0 +1,65 @@
from html.parser import HTMLParser
from yt_dlp.utils import clean_html, get_element_by_class, get_elements_by_class
from ..base_provider import AnimeProvider
from .constants import ANIWAVE_BASE, SEARCH_HEADERS
class ParseAnchorAndImgTag(HTMLParser):
def __init__(self):
super().__init__()
self.img_tag = None
self.a_tag = None
def handle_starttag(self, tag, attrs):
if tag == "img":
self.img_tag = {attr[0]: attr[1] for attr in attrs}
if tag == "a":
self.a_tag = {attr[0]: attr[1] for attr in attrs}
class AniWaveApi(AnimeProvider):
def search_for_anime(self, anime_title, *args):
self.session.headers.update(SEARCH_HEADERS)
search_url = f"{ANIWAVE_BASE}/filter"
params = {"keyword": anime_title}
res = self.session.get(search_url, params=params)
search_page = res.text
search_results_html_list = get_elements_by_class("item", search_page)
results = []
for result_html in search_results_html_list:
aniposter_html = get_element_by_class("poster", result_html)
episode_html = get_element_by_class("sub", aniposter_html)
episodes = clean_html(episode_html) or 12
if not aniposter_html:
return
parser = ParseAnchorAndImgTag()
parser.feed(aniposter_html)
image_data = parser.img_tag
anime_link_data = parser.a_tag
if not image_data or not anime_link_data:
continue
episodes = int(episodes)
# finally!!
image_link = image_data["src"]
title = image_data["alt"]
anime_id = anime_link_data["href"]
results.append(
{
"availableEpisodes": list(range(1, episodes)),
"id": anime_id,
"title": title,
"poster": image_link,
}
)
self.search_results = results
return {"pageInfo": {}, "results": results}
def get_anime(self, anime_id, *args):
anime_page_url = f"{ANIWAVE_BASE}{anime_id}"
self.session.get(anime_page_url)
# TODO: to be continued; mostly js so very difficult

View File

@@ -0,0 +1,20 @@
ANIWAVE_BASE = "https://aniwave.to"
SEARCH_HEADERS = {
"Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/png,image/svg+xml,*/*;q=0.8",
"Accept-Language": "en-US,en;q=0.5",
# 'Accept-Encoding': 'Utf-8',
"Referer": "https://aniwave.to/filter",
"DNT": "1",
"Upgrade-Insecure-Requests": "1",
"Sec-Fetch-Dest": "document",
"Sec-Fetch-Mode": "navigate",
"Sec-Fetch-Site": "same-origin",
"Sec-Fetch-User": "?1",
"Connection": "keep-alive",
"Alt-Used": "aniwave.to",
# 'Cookie': '__pf=1; usertype=guest; session=BElk9DJdO3sFdDmLiGxuNiM9eGYO1TjktGsmdwjV',
"Priority": "u=0, i",
# Requests doesn't support trailers
# 'TE': 'trailers',
}

View File

@@ -1,8 +1,13 @@
import requests
+from yt_dlp.utils.networking import random_user_agent
class AnimeProvider:
session: requests.Session
+USER_AGENT = random_user_agent()
+HEADERS = {}
def __init__(self) -> None:
self.session = requests.session()
+self.session.headers.update({"User-Agent": self.USER_AGENT, **self.HEADERS})
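
A minimal sketch of how the reworked base class is meant to be consumed; ExampleProvider and its Referer are made up for illustration, while the real providers (for instance AnimePaheApi with HEADERS = REQUEST_HEADERS) follow the same pattern:

class ExampleProvider(AnimeProvider):
    # provider-specific defaults, merged into the shared session at construction time
    HEADERS = {"Referer": "https://example.org/"}

provider = ExampleProvider()
# every request made through provider.session now carries the random
# User-Agent plus the class-level HEADERS, so call sites no longer pass headers=...
print(provider.session.headers["User-Agent"], provider.session.headers["Referer"])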

View File

@@ -0,0 +1,15 @@
import logging
from requests import get
logger = logging.getLogger(__name__)
def fetch_anime_info_from_bal(anilist_id):
try:
url = f"https://raw.githubusercontent.com/bal-mackup/mal-backup/master/anilist/anime/{anilist_id}.json"
response = get(url, timeout=11)
if response.status_code == 200:
return response.json()
except Exception as e:
logger.error(e)
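
A hedged usage sketch of the helper above; the id is only an example, the call needs network access, and None is returned whenever the lookup fails:

info = fetch_anime_info_from_bal(21)  # 21 is One Piece on AniList, used here purely as an example id
if info:
    print(list(info)[:5])  # peek at the keys of the mapping served by the mal-backup repo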

View File

@@ -0,0 +1,153 @@
import logging
from typing import TYPE_CHECKING
from requests import post
from thefuzz import fuzz
if TYPE_CHECKING:
from ..anilist.types import AnilistDataSchema
logger = logging.getLogger(__name__)
ANILIST_ENDPOINT = "https://graphql.anilist.co"
"""
query($query:String){
Page(perPage:50){
pageInfo{
total
currentPage
hasNextPage
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
episodes
status
nextAiringEpisode {
timeUntilAiring
airingAt
episode
}
}
}
}
"""
def search_for_anime_with_anilist(anime_title: str):
query = """
query($query:String){
Page(perPage:50){
pageInfo{
total
currentPage
hasNextPage
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
episodes
status
nextAiringEpisode {
timeUntilAiring
airingAt
episode
}
}
}
}
"""
response = post(
ANILIST_ENDPOINT,
json={"query": query, "variables": {"query": anime_title}},
timeout=10,
)
if response.status_code == 200:
anilist_data: "AnilistDataSchema" = response.json()
return {
"pageInfo": anilist_data["data"]["Page"]["pageInfo"],
"results": [
{
"id": anime_result["id"],
"title": anime_result["title"]["romaji"]
or anime_result["title"]["english"],
"type": "anime",
"availableEpisodes": list(
range(
1,
(
anime_result["episodes"]
if not anime_result["status"] == "RELEASING"
and anime_result["episodes"]
else (
anime_result["nextAiringEpisode"]["episode"] - 1
if anime_result["nextAiringEpisode"]
else 0
)
),
)
),
}
for anime_result in anilist_data["data"]["Page"]["media"]
],
}
def get_mal_id_and_anilist_id(anime_title: str) -> "dict[str,int] | None":
"""the abstraction over all none authenticated requests and that returns data of a similar type
Args:
query: the anilist query
variables: the anilist api variables
Returns:
a boolean indicating success and none or an anilist object depending on success
"""
query = """
query($query:String){
Page(perPage:50){
pageInfo{
total
currentPage
hasNextPage
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
}
}
}
"""
try:
variables = {"query": anime_title}
response = post(
ANILIST_ENDPOINT,
json={"query": query, "variables": variables},
timeout=10,
)
anilist_data: "AnilistDataSchema" = response.json()
if response.status_code == 200:
anime = max(
anilist_data["data"]["Page"]["media"],
key=lambda anime: max(
(
fuzz.ratio(anime_title, str(anime["title"]["romaji"])),
fuzz.ratio(anime_title, str(anime["title"]["english"])),
)
),
)
return {"id_anilist": anime["id"], "id_mal": anime["idMal"]}
except Exception as e:
logger.error(f"Something unexpected occured {e}")

View File

@@ -39,9 +39,20 @@ class AnimeEpisodeDetails(TypedDict):
raw: list[str]
-class AnimeEpisode(TypedDict):
+#
+# class AnimeEpisode(TypedDict):
+# id: str
+# title: str
+#
+class AnimeEpisodeInfo(TypedDict):
id: str
title: str
+episode: str
+poster: str | None
+duration: str | None
+translation_type: str | None
class Anime(TypedDict):
@@ -49,7 +60,7 @@ class Anime(TypedDict):
title: str
availableEpisodesDetail: AnimeEpisodeDetails
type: str | None
-episodesInfo: list[AnimeEpisode] | None
+episodesInfo: list[AnimeEpisodeInfo] | None
poster: str
year: str
@@ -64,8 +75,15 @@ class EpisodeStream(TypedDict):
translation_type: Literal["dub", "sub"]
+class Subtitle(TypedDict):
+url: str
+language: str
class Server(TypedDict):
headers: dict
+subtitles: list[Subtitle]
+audio: list
server: str
episode_title: str
links: list[EpisodeStream]
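
For reference, a hand-written payload matching the extended Server shape; the values are illustrative and the EpisodeStream fields mirror those produced by the providers in this change (link, quality, translation_type):

example_server: Server = {
    "server": "kwik",
    "episode_title": "Example Anime; Episode 1",
    "headers": {},
    "subtitles": [{"url": "https://example.org/eng.vtt", "language": "English"}],
    "audio": [],
    "links": [
        {"link": "https://example.org/ep1.m3u8", "quality": "1080", "translation_type": "sub"}
    ],
}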

View File

@@ -35,12 +35,12 @@ hex_to_char = {
}
-def give_random_quality(links: list[dict]):
+def give_random_quality(links):
qualities = cycle(["1080", "720", "480", "360"])
return [
-{"link": link["link"], "quality": quality}
-for link, quality in zip(links, qualities)
+{**episode_stream, "quality": quality}
+for episode_stream, quality in zip(links, qualities)
]
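
A quick illustration of the new behaviour: every key on the incoming stream dicts (for example the "type" field the AniWatch provider passes through) is preserved, and only a cycled quality label is attached:

links = [
    {"link": "https://example.org/a.m3u8", "type": "hls"},
    {"link": "https://example.org/b.mp4", "type": "mp4"},
]
print(give_random_quality(links))
# [{'link': '...a.m3u8', 'type': 'hls', 'quality': '1080'},
#  {'link': '...b.mp4', 'type': 'mp4', 'quality': '720'}]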

View File

@@ -0,0 +1,223 @@
import logging
from typing import TYPE_CHECKING
from requests import post
from thefuzz import fuzz
if TYPE_CHECKING:
from ..anilist.types import AnilistDataSchema
logger = logging.getLogger(__name__)
ANILIST_ENDPOINT = "https://graphql.anilist.co"
"""
query($query:String){
Page(perPage:50){
pageInfo{
total
currentPage
hasNextPage
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
episodes
status
nextAiringEpisode {
timeUntilAiring
airingAt
episode
}
}
}
}
"""
def search_foranime_with_anilist(anime_title: str):
query = """
query($query:String){
Page(perPage:50){
pageInfo{
total
currentPage
hasNextPage
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
episodes
status
nextAiringEpisode {
timeUntilAiring
airingAt
episode
}
}
}
}
"""
response = post(
ANILIST_ENDPOINT,
json={"query": query, "variables": {"query": anime_title}},
timeout=10,
)
if response.status_code == 200:
anilist_data: "AnilistDataSchema" = response.json()
return {
"pageInfo": anilist_data["data"]["Page"]["pageInfo"],
"results": [
{
"id": anime_result["id"],
"title": anime_result["title"]["romaji"]
or anime_result["title"]["english"],
"type": "anime",
"availableEpisodes": list(
range(
1,
(
anime_result["episodes"]
if not anime_result["status"] == "RELEASING"
and anime_result["episodes"]
else (
anime_result["nextAiringEpisode"]["episode"] - 1
if anime_result["nextAiringEpisode"]
else 0
)
),
)
),
}
for anime_result in anilist_data["data"]["Page"]["media"]
],
}
def get_mal_id_and_anilist_id(anime_title: str) -> "dict[str,int] | None":
"""the abstraction over all none authenticated requests and that returns data of a similar type
Args:
query: the anilist query
variables: the anilist api variables
Returns:
a boolean indicating success and none or an anilist object depending on success
"""
query = """
query($query:String){
Page(perPage:50){
pageInfo{
total
currentPage
hasNextPage
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
}
}
}
"""
try:
variables = {"query": anime_title}
response = post(
ANILIST_ENDPOINT,
json={"query": query, "variables": variables},
timeout=10,
)
anilist_data: "AnilistDataSchema" = response.json()
if response.status_code == 200:
anime = max(
anilist_data["data"]["Page"]["media"],
key=lambda anime: max(
(
fuzz.ratio(anime_title, str(anime["title"]["romaji"])),
fuzz.ratio(anime_title, str(anime["title"]["english"])),
)
),
)
return {"id_anilist": anime["id"], "id_mal": anime["idMal"]}
except Exception as e:
logger.error(f"Something unexpected occured {e}")
def get_basic_anime_info_by_title(anime_title: str):
"""the abstraction over all none authenticated requests and that returns data of a similar type
Args:
query: the anilist query
variables: the anilist api variables
Returns:
a boolean indicating success and none or an anilist object depending on success
"""
query = """
query($query:String){
Page(perPage:50){
pageInfo{
total
}
media(search:$query,type:ANIME){
id
idMal
title{
romaji
english
}
streamingEpisodes{
title
}
}
}
}
"""
from ...Utility.data import anime_normalizer
# normalize the title
anime_title = anime_normalizer.get(anime_title, anime_title)
try:
variables = {"query": anime_title}
response = post(
ANILIST_ENDPOINT,
json={"query": query, "variables": variables},
timeout=10,
)
anilist_data: "AnilistDataSchema" = response.json()
if response.status_code == 200:
anime = max(
anilist_data["data"]["Page"]["media"],
key=lambda anime: max(
(
fuzz.ratio(anime_title, str(anime["title"]["romaji"])),
fuzz.ratio(anime_title, str(anime["title"]["english"])),
)
),
)
return {
"idAilist": anime["id"],
"idMal": anime["idMal"],
"title": {
"english": anime["title"]["english"],
"romaji": anime["title"]["romaji"],
},
"episodes": [
{"title": episode["title"]}
for episode in anime["streamingEpisodes"]
if episode
],
}
except Exception as e:
logger.error(f"Something unexpected occured {e}")

poetry.lock generated
View File

@@ -194,13 +194,13 @@ cffi = ">=1.0.0"
[[package]]
name = "cachetools"
-version = "5.4.0"
+version = "5.5.0"
description = "Extensible memoizing collections and decorators"
optional = false
python-versions = ">=3.7"
files = [
-    {file = "cachetools-5.4.0-py3-none-any.whl", hash = "sha256:3ae3b49a3d5e28a77a0be2b37dbcb89005058959cb2323858c2657c4a8cab474"},
-    {file = "cachetools-5.4.0.tar.gz", hash = "sha256:b8adc2e7c07f105ced7bc56dbb6dfbe7c4a00acce20e2227b3f355be89bc6827"},
+    {file = "cachetools-5.5.0-py3-none-any.whl", hash = "sha256:02134e8439cdc2ffb62023ce1debca2944c3f289d66bb17ead3ab3dede74b292"},
+    {file = "cachetools-5.5.0.tar.gz", hash = "sha256:2cc24fb4cbe39633fb7badd9db9ca6295d766d9c2995f245725a46715d050f2a"},
]
[[package]]
@@ -845,13 +845,13 @@ testing = ["covdefaults (>=2.3)", "pytest (>=8.2.2)", "pytest-cov (>=5)", "pytes
[[package]]
name = "pyright"
-version = "1.1.376"
+version = "1.1.377"
description = "Command line wrapper for pyright"
optional = false
python-versions = ">=3.7"
files = [
-    {file = "pyright-1.1.376-py3-none-any.whl", hash = "sha256:0f2473b12c15c46b3207f0eec224c3cea2bdc07cd45dd4a037687cbbca0fbeff"},
-    {file = "pyright-1.1.376.tar.gz", hash = "sha256:bffd63b197cd0810395bb3245c06b01f95a85ddf6bfa0e5644ed69c841e954dd"},
+    {file = "pyright-1.1.377-py3-none-any.whl", hash = "sha256:af0dd2b6b636c383a6569a083f8c5a8748ae4dcde5df7914b3f3f267e14dd162"},
+    {file = "pyright-1.1.377.tar.gz", hash = "sha256:aabc30fedce0ded34baa0c49b24f10e68f4bfc8f68ae7f3d175c4b0f256b4fcf"},
]
[package.dependencies]
@@ -1243,83 +1243,97 @@ files = [
[[package]]
name = "websockets"
-version = "12.0"
+version = "13.0"
description = "An implementation of the WebSocket Protocol (RFC 6455 & 7692)"
optional = false
python-versions = ">=3.8"
files = [
{file = "websockets-12.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d554236b2a2006e0ce16315c16eaa0d628dab009c33b63ea03f41c6107958374"}, {file = "websockets-13.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:ad4fa707ff9e2ffee019e946257b5300a45137a58f41fbd9a4db8e684ab61528"},
{file = "websockets-12.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2d225bb6886591b1746b17c0573e29804619c8f755b5598d875bb4235ea639be"}, {file = "websockets-13.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6fd757f313c13c34dae9f126d3ba4cf97175859c719e57c6a614b781c86b617e"},
{file = "websockets-12.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:eb809e816916a3b210bed3c82fb88eaf16e8afcf9c115ebb2bacede1797d2547"}, {file = "websockets-13.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:cbac2eb7ce0fac755fb983c9247c4a60c4019bcde4c0e4d167aeb17520cc7ef1"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c588f6abc13f78a67044c6b1273a99e1cf31038ad51815b3b016ce699f0d75c2"}, {file = "websockets-13.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4b83cf7354cbbc058e97b3e545dceb75b8d9cf17fd5a19db419c319ddbaaf7a"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5aa9348186d79a5f232115ed3fa9020eab66d6c3437d72f9d2c8ac0c6858c558"}, {file = "websockets-13.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9202c0010c78fad1041e1c5285232b6508d3633f92825687549540a70e9e5901"},
{file = "websockets-12.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6350b14a40c95ddd53e775dbdbbbc59b124a5c8ecd6fbb09c2e52029f7a9f480"}, {file = "websockets-13.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e6566e79c8c7cbea75ec450f6e1828945fc5c9a4769ceb1c7b6e22470539712"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:70ec754cc2a769bcd218ed8d7209055667b30860ffecb8633a834dde27d6307c"}, {file = "websockets-13.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:e7fcad070dcd9ad37a09d89a4cbc2a5e3e45080b88977c0da87b3090f9f55ead"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6e96f5ed1b83a8ddb07909b45bd94833b0710f738115751cdaa9da1fb0cb66e8"}, {file = "websockets-13.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:0a8f7d65358a25172db00c69bcc7df834155ee24229f560d035758fd6613111a"},
{file = "websockets-12.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4d87be612cbef86f994178d5186add3d94e9f31cc3cb499a0482b866ec477603"}, {file = "websockets-13.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:63b702fb31e3f058f946ccdfa551f4d57a06f7729c369e8815eb18643099db37"},
{file = "websockets-12.0-cp310-cp310-win32.whl", hash = "sha256:befe90632d66caaf72e8b2ed4d7f02b348913813c8b0a32fae1cc5fe3730902f"}, {file = "websockets-13.0-cp310-cp310-win32.whl", hash = "sha256:3a20cf14ba7b482c4a1924b5e061729afb89c890ca9ed44ac4127c6c5986e424"},
{file = "websockets-12.0-cp310-cp310-win_amd64.whl", hash = "sha256:363f57ca8bc8576195d0540c648aa58ac18cf85b76ad5202b9f976918f4219cf"}, {file = "websockets-13.0-cp310-cp310-win_amd64.whl", hash = "sha256:587245f0704d0bb675f919898d7473e8827a6d578e5a122a21756ca44b811ec8"},
{file = "websockets-12.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5d873c7de42dea355d73f170be0f23788cf3fa9f7bed718fd2830eefedce01b4"}, {file = "websockets-13.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:06df8306c241c235075d2ae77367038e701e53bc8c1bb4f6644f4f53aa6dedd0"},
{file = "websockets-12.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3f61726cae9f65b872502ff3c1496abc93ffbe31b278455c418492016e2afc8f"}, {file = "websockets-13.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:85a1f92a02f0b8c1bf02699731a70a8a74402bb3f82bee36e7768b19a8ed9709"},
{file = "websockets-12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:ed2fcf7a07334c77fc8a230755c2209223a7cc44fc27597729b8ef5425aa61a3"}, {file = "websockets-13.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9ed02c604349068d46d87ef4c2012c112c791f2bec08671903a6bb2bd9c06784"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e332c210b14b57904869ca9f9bf4ca32f5427a03eeb625da9b616c85a3a506c"}, {file = "websockets-13.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b89849171b590107f6724a7b0790736daead40926ddf47eadf998b4ff51d6414"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5693ef74233122f8ebab026817b1b37fe25c411ecfca084b29bc7d6efc548f45"}, {file = "websockets-13.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:939a16849d71203628157a5e4a495da63967c744e1e32018e9b9e2689aca64d4"},
{file = "websockets-12.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e9e7db18b4539a29cc5ad8c8b252738a30e2b13f033c2d6e9d0549b45841c04"}, {file = "websockets-13.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ad818cdac37c0ad4c58e51cb4964eae4f18b43c4a83cb37170b0d90c31bd80cf"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:6e2df67b8014767d0f785baa98393725739287684b9f8d8a1001eb2839031447"}, {file = "websockets-13.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cbfe82a07596a044de78bb7a62519e71690c5812c26c5f1d4b877e64e4f46309"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:bea88d71630c5900690fcb03161ab18f8f244805c59e2e0dc4ffadae0a7ee0ca"}, {file = "websockets-13.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e07e76c49f39c5b45cbd7362b94f001ae209a3ea4905ae9a09cfd53b3c76373d"},
{file = "websockets-12.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:dff6cdf35e31d1315790149fee351f9e52978130cef6c87c4b6c9b3baf78bc53"}, {file = "websockets-13.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:372f46a0096cfda23c88f7e42349a33f8375e10912f712e6b496d3a9a557290f"},
{file = "websockets-12.0-cp311-cp311-win32.whl", hash = "sha256:3e3aa8c468af01d70332a382350ee95f6986db479ce7af14d5e81ec52aa2b402"}, {file = "websockets-13.0-cp311-cp311-win32.whl", hash = "sha256:376a43a4fd96725f13450d3d2e98f4f36c3525c562ab53d9a98dd2950dca9a8a"},
{file = "websockets-12.0-cp311-cp311-win_amd64.whl", hash = "sha256:25eb766c8ad27da0f79420b2af4b85d29914ba0edf69f547cc4f06ca6f1d403b"}, {file = "websockets-13.0-cp311-cp311-win_amd64.whl", hash = "sha256:2be1382a4daa61e2f3e2be3b3c86932a8db9d1f85297feb6e9df22f391f94452"},
{file = "websockets-12.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0e6e2711d5a8e6e482cacb927a49a3d432345dfe7dea8ace7b5790df5932e4df"}, {file = "websockets-13.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:b5407c34776b9b77bd89a5f95eb0a34aaf91889e3f911c63f13035220eb50107"},
{file = "websockets-12.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:dbcf72a37f0b3316e993e13ecf32f10c0e1259c28ffd0a85cee26e8549595fbc"}, {file = "websockets-13.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:4782ec789f059f888c1e8fdf94383d0e64b531cffebbf26dd55afd53ab487ca4"},
{file = "websockets-12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:12743ab88ab2af1d17dd4acb4645677cb7063ef4db93abffbf164218a5d54c6b"}, {file = "websockets-13.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c8feb8e19ef65c9994e652c5b0324abd657bedd0abeb946fb4f5163012c1e730"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7b645f491f3c48d3f8a00d1fce07445fab7347fec54a3e65f0725d730d5b99cb"}, {file = "websockets-13.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d3f3d2e20c442b58dbac593cb1e02bc02d149a86056cc4126d977ad902472e3b"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9893d1aa45a7f8b3bc4510f6ccf8db8c3b62120917af15e3de247f0780294b92"}, {file = "websockets-13.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e39d393e0ab5b8bd01717cc26f2922026050188947ff54fe6a49dc489f7750b7"},
{file = "websockets-12.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1f38a7b376117ef7aff996e737583172bdf535932c9ca021746573bce40165ed"}, {file = "websockets-13.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1f661a4205741bdc88ac9c2b2ec003c72cee97e4acd156eb733662ff004ba429"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:f764ba54e33daf20e167915edc443b6f88956f37fb606449b4a5b10ba42235a5"}, {file = "websockets-13.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:384129ad0490e06bab2b98c1da9b488acb35bb11e2464c728376c6f55f0d45f3"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:1e4b3f8ea6a9cfa8be8484c9221ec0257508e3a1ec43c36acdefb2a9c3b00aa2"}, {file = "websockets-13.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:df5c0eff91f61b8205a6c9f7b255ff390cdb77b61c7b41f79ca10afcbb22b6cb"},
{file = "websockets-12.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9fdf06fd06c32205a07e47328ab49c40fc1407cdec801d698a7c41167ea45113"}, {file = "websockets-13.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:02cc9bb1a887dac0e08bf657c5d00aa3fac0d03215d35a599130c2034ae6663a"},
{file = "websockets-12.0-cp312-cp312-win32.whl", hash = "sha256:baa386875b70cbd81798fa9f71be689c1bf484f65fd6fb08d051a0ee4e79924d"}, {file = "websockets-13.0-cp312-cp312-win32.whl", hash = "sha256:d9726d2c9bd6aed8cb994d89b3910ca0079406edce3670886ec828a73e7bdd53"},
{file = "websockets-12.0-cp312-cp312-win_amd64.whl", hash = "sha256:ae0a5da8f35a5be197f328d4727dbcfafa53d1824fac3d96cdd3a642fe09394f"}, {file = "websockets-13.0-cp312-cp312-win_amd64.whl", hash = "sha256:fa0839f35322f7b038d8adcf679e2698c3a483688cc92e3bd15ee4fb06669e9a"},
{file = "websockets-12.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5f6ffe2c6598f7f7207eef9a1228b6f5c818f9f4d53ee920aacd35cec8110438"}, {file = "websockets-13.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:da7e501e59857e8e3e9d10586139dc196b80445a591451ca9998aafba1af5278"},
{file = "websockets-12.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9edf3fc590cc2ec20dc9d7a45108b5bbaf21c0d89f9fd3fd1685e223771dc0b2"}, {file = "websockets-13.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a00e1e587c655749afb5b135d8d3edcfe84ec6db864201e40a882e64168610b3"},
{file = "websockets-12.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:8572132c7be52632201a35f5e08348137f658e5ffd21f51f94572ca6c05ea81d"}, {file = "websockets-13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a7fbf2a8fe7556a8f4e68cb3e736884af7bf93653e79f6219f17ebb75e97d8f0"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:604428d1b87edbf02b233e2c207d7d528460fa978f9e391bd8aaf9c8311de137"}, {file = "websockets-13.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7ea9c9c7443a97ea4d84d3e4d42d0e8c4235834edae652993abcd2aff94affd7"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1a9d160fd080c6285e202327aba140fc9a0d910b09e423afff4ae5cbbf1c7205"}, {file = "websockets-13.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:35c2221b539b360203f3f9ad168e527bf16d903e385068ae842c186efb13d0ea"},
{file = "websockets-12.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87b4aafed34653e465eb77b7c93ef058516cb5acf3eb21e42f33928616172def"}, {file = "websockets-13.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:358d37c5c431dd050ffb06b4b075505aae3f4f795d7fff9794e5ed96ce99b998"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b2ee7288b85959797970114deae81ab41b731f19ebcd3bd499ae9ca0e3f1d2c8"}, {file = "websockets-13.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:038e7a0f1bfafc7bf52915ab3506b7a03d1e06381e9f60440c856e8918138151"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:7fa3d25e81bfe6a89718e9791128398a50dec6d57faf23770787ff441d851967"}, {file = "websockets-13.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:fd038bc9e2c134847f1e0ce3191797fad110756e690c2fdd9702ed34e7a43abb"},
{file = "websockets-12.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a571f035a47212288e3b3519944f6bf4ac7bc7553243e41eac50dd48552b6df7"}, {file = "websockets-13.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:93b8c2008f372379fb6e5d2b3f7c9ec32f7b80316543fd3a5ace6610c5cde1b0"},
{file = "websockets-12.0-cp38-cp38-win32.whl", hash = "sha256:3c6cc1360c10c17463aadd29dd3af332d4a1adaa8796f6b0e9f9df1fdb0bad62"}, {file = "websockets-13.0-cp313-cp313-win32.whl", hash = "sha256:851fd0afb3bc0b73f7c5b5858975d42769a5fdde5314f4ef2c106aec63100687"},
{file = "websockets-12.0-cp38-cp38-win_amd64.whl", hash = "sha256:1bf386089178ea69d720f8db6199a0504a406209a0fc23e603b27b300fdd6892"}, {file = "websockets-13.0-cp313-cp313-win_amd64.whl", hash = "sha256:7d14901fdcf212804970c30ab9ee8f3f0212e620c7ea93079d6534863444fb4e"},
{file = "websockets-12.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:ab3d732ad50a4fbd04a4490ef08acd0517b6ae6b77eb967251f4c263011a990d"}, {file = "websockets-13.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ae7a519a56a714f64c3445cabde9fc2fc927e7eae44f413eae187cddd9e54178"},
{file = "websockets-12.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a1d9697f3337a89691e3bd8dc56dea45a6f6d975f92e7d5f773bc715c15dde28"}, {file = "websockets-13.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:5575031472ca87302aeb2ce2c2349f4c6ea978c86a9d1289bc5d16058ad4c10a"},
{file = "websockets-12.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1df2fbd2c8a98d38a66f5238484405b8d1d16f929bb7a33ed73e4801222a6f53"}, {file = "websockets-13.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9895df6cd0bfe79d09bcd1dbdc03862846f26fbd93797153de954306620c1d00"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23509452b3bc38e3a057382c2e941d5ac2e01e251acce7adc74011d7d8de434c"}, {file = "websockets-13.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a4de299c947a54fca9ce1c5fd4a08eb92ffce91961becb13bd9195f7c6e71b47"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e5fc14ec6ea568200ea4ef46545073da81900a2b67b3e666f04adf53ad452ec"}, {file = "websockets-13.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:05c25f7b849702950b6fd0e233989bb73a0d2bc83faa3b7233313ca395205f6d"},
{file = "websockets-12.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46e71dbbd12850224243f5d2aeec90f0aaa0f2dde5aeeb8fc8df21e04d99eff9"}, {file = "websockets-13.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ede95125a30602b1691a4b1da88946bf27dae283cf30f22cd2cb8ca4b2e0d119"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b81f90dcc6c85a9b7f29873beb56c94c85d6f0dac2ea8b60d995bd18bf3e2aae"}, {file = "websockets-13.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:addf0a16e4983280efed272d8cb3b2e05f0051755372461e7d966b80a6554e16"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:a02413bc474feda2849c59ed2dfb2cddb4cd3d2f03a2fedec51d6e959d9b608b"}, {file = "websockets-13.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:06b3186e97bf9a33921fa60734d5ed90f2a9b407cce8d23c7333a0984049ef61"},
{file = "websockets-12.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:bbe6013f9f791944ed31ca08b077e26249309639313fff132bfbf3ba105673b9"}, {file = "websockets-13.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:eae368cac85adc4c7dc3b0d5f84ffcca609d658db6447387300478e44db70796"},
{file = "websockets-12.0-cp39-cp39-win32.whl", hash = "sha256:cbe83a6bbdf207ff0541de01e11904827540aa069293696dd528a6640bd6a5f6"}, {file = "websockets-13.0-cp38-cp38-win32.whl", hash = "sha256:337837ac788d955728b1ab01876d72b73da59819a3388e1c5e8e05c3999f1afa"},
{file = "websockets-12.0-cp39-cp39-win_amd64.whl", hash = "sha256:fc4e7fa5414512b481a2483775a8e8be7803a35b30ca805afa4998a84f9fd9e8"}, {file = "websockets-13.0-cp38-cp38-win_amd64.whl", hash = "sha256:f66e00e42f25ca7e91076366303e11c82572ca87cc5aae51e6e9c094f315ab41"},
{file = "websockets-12.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:248d8e2446e13c1d4326e0a6a4e9629cb13a11195051a73acf414812700badbd"}, {file = "websockets-13.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:94c1c02721139fe9940b38d28fb15b4b782981d800d5f40f9966264fbf23dcc8"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f44069528d45a933997a6fef143030d8ca8042f0dfaad753e2906398290e2870"}, {file = "websockets-13.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:bd4ba86513430513e2aa25a441bb538f6f83734dc368a2c5d18afdd39097aa33"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c4e37d36f0d19f0a4413d3e18c0d03d0c268ada2061868c1e6f5ab1a6d575077"}, {file = "websockets-13.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a1ab8f0e0cadc5be5f3f9fa11a663957fecbf483d434762c8dfb8aa44948944a"},
{file = "websockets-12.0-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d829f975fc2e527a3ef2f9c8f25e553eb7bc779c6665e8e1d52aa22800bb38b"}, {file = "websockets-13.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3670def5d3dfd5af6f6e2b3b243ea8f1f72d8da1ef927322f0703f85c90d9603"},
{file = "websockets-12.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:2c71bd45a777433dd9113847af751aae36e448bc6b8c361a566cb043eda6ec30"}, {file = "websockets-13.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6058b6be92743358885ad6dcdecb378fde4a4c74d4dd16a089d07580c75a0e80"},
{file = "websockets-12.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:0bee75f400895aef54157b36ed6d3b308fcab62e5260703add87f44cee9c82a6"}, {file = "websockets-13.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:516062a0a8ef5ecbfa4acbaec14b199fc070577834f9fe3d40800a99f92523ca"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:423fc1ed29f7512fceb727e2d2aecb952c46aa34895e9ed96071821309951123"}, {file = "websockets-13.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:da7e918d82e7bdfc6f66d31febe1b2e28a1ca3387315f918de26f5e367f61572"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:27a5e9964ef509016759f2ef3f2c1e13f403725a5e6a1775555994966a66e931"}, {file = "websockets-13.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:9cc7f35dcb49a4e32db82a849fcc0714c4d4acc9d2273aded2d61f87d7f660b7"},
{file = "websockets-12.0-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3181df4583c4d3994d31fb235dc681d2aaad744fbdbf94c4802485ececdecf2"}, {file = "websockets-13.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:f5737c53eb2c8ed8f64b50d3dafd3c1dae739f78aa495a288421ac1b3de82717"},
{file = "websockets-12.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:b067cb952ce8bf40115f6c19f478dc71c5e719b7fbaa511359795dfd9d1a6468"}, {file = "websockets-13.0-cp39-cp39-win32.whl", hash = "sha256:265e1f0d3f788ce8ef99dca591a1aec5263b26083ca0934467ad9a1d1181067c"},
{file = "websockets-12.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:00700340c6c7ab788f176d118775202aadea7602c5cc6be6ae127761c16d6b0b"}, {file = "websockets-13.0-cp39-cp39-win_amd64.whl", hash = "sha256:4d70c89e3d3b347a7c4d3c33f8d323f0584c9ceb69b82c2ef8a174ca84ea3d4a"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e469d01137942849cff40517c97a30a93ae79917752b34029f0ec72df6b46399"}, {file = "websockets-13.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:602cbd010d8c21c8475f1798b705bb18567eb189c533ab5ef568bc3033fdf417"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffefa1374cd508d633646d51a8e9277763a9b78ae71324183693959cf94635a7"}, {file = "websockets-13.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:bf8eb5dca4f484a60f5327b044e842e0d7f7cdbf02ea6dc4a4f811259f1f1f0b"},
{file = "websockets-12.0-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba0cab91b3956dfa9f512147860783a1829a8d905ee218a9837c18f683239611"}, {file = "websockets-13.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:89d795c1802d99a643bf689b277e8604c14b5af1bc0a31dade2cd7a678087212"},
{file = "websockets-12.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2cb388a5bfb56df4d9a406783b7f9dbefb888c09b71629351cc6b036e9259370"}, {file = "websockets-13.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:788bc841d250beccff67a20a5a53a15657a60111ef9c0c0a97fbdd614fae0fe2"},
{file = "websockets-12.0-py3-none-any.whl", hash = "sha256:dc284bbc8d7c78a6c69e0c7325ab46ee5e40bb4d50e494d8131a07ef47500e9e"}, {file = "websockets-13.0-pp310-pypy310_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7334752052532c156d28b8eaf3558137e115c7871ea82adff69b6d94a7bee273"},
{file = "websockets-12.0.tar.gz", hash = "sha256:81df9cbcbb6c260de1e007e58c011bfebe2dafc8435107b0537f393dd38c8b1b"}, {file = "websockets-13.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:e7a1963302947332c3039e3f66209ec73b1626f8a0191649e0713c391e9f5b0d"},
{file = "websockets-13.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2e1cf4e1eb84b4fd74a47688e8b0940c89a04ad9f6937afa43d468e71128cd68"},
{file = "websockets-13.0-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:c026ee729c4ce55708a14b839ba35086dfae265fc12813b62d34ce33f4980c1c"},
{file = "websockets-13.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5f9d23fbbf96eefde836d9692670bfc89e2d159f456d499c5efcf6a6281c1af"},
{file = "websockets-13.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ad684cb7efce227d756bae3e8484f2e56aa128398753b54245efdfbd1108f2c"},
{file = "websockets-13.0-pp38-pypy38_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e1e10b3fbed7be4a59831d3a939900e50fcd34d93716e433d4193a4d0d1d335d"},
{file = "websockets-13.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:d42a818e634f789350cd8fb413a3f5eec1cf0400a53d02062534c41519f5125c"},
{file = "websockets-13.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:e5ba5e9b332267d0f2c33ede390061850f1ac3ee6cd1bdcf4c5ea33ead971966"},
{file = "websockets-13.0-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:f9af457ed593e35f467140d8b61d425495b127744a9d65d45a366f8678449a23"},
{file = "websockets-13.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bcea3eb58c09c3a31cc83b45c06d5907f02ddaf10920aaa6443975310f699b95"},
{file = "websockets-13.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c210d1460dc8d326ffdef9703c2f83269b7539a1690ad11ae04162bc1878d33d"},
{file = "websockets-13.0-pp39-pypy39_pp73-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b32f38bc81170fd56d0482d505b556e52bf9078b36819a8ba52624bd6667e39e"},
{file = "websockets-13.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:81a11a1ddd5320429db47c04d35119c3e674d215173d87aaeb06ae80f6e9031f"},
{file = "websockets-13.0-py3-none-any.whl", hash = "sha256:dbbac01e80aee253d44c4f098ab3cc17c822518519e869b284cfbb8cd16cc9de"},
{file = "websockets-13.0.tar.gz", hash = "sha256:b7bf950234a482b7461afdb2ec99eee3548ec4d53f418c7990bb79c620476602"},
]
[[package]]

View File

@@ -1,6 +1,6 @@
[tool.poetry]
name = "fastanime"
-version = "2.2.5"
+version = "2.4.0"
description = "A browser anime site experience from the terminal"
authors = ["Benextempest <benextempest@gmail.com>"]
license = "UNLICENSE"

View File

@@ -1,4 +1,5 @@
{
-"typeCheckingMode": "standard",
-"reportPrivateImportUsage": false
+"venvPath": ".",
+"venv": ".venv",
+"pythonVersion": "3.10"
}