The simplest way to self-host Mythalion 13B. Launch a dedicated cloud GPU server running Lab Station OS to download and serve the model using any compatible app or framework.
Download model weights for local inference. Must be used with a compatible app, notebook, or codebase. May run slowly, or not work at all, depending on your system resources, particularly GPU(s) and available VRAM.
Mythalion-13B merges Pygmalion-2 and MythoMax models on the Llama-2 architecture, optimized for conversation and creative writing. It supports both Alpaca and Pygmalion prompt formats, trained on diverse dialogue datasets. Notable for combining existing models rather than training from scratch.
Mythalion-13B represents a significant advancement in large language model development, combining the strengths of two powerful base models: Pygmalion-2 13B and MythoMax 13B. Built on the Llama-2 architecture, this merged model demonstrates enhanced capabilities in role-playing and conversational tasks, as detailed in the PygmalionAI blog post.
The model utilizes the LlamaForCausalLM architecture, implemented through PyTorch and employing Safetensors for model weight storage. With 13 billion parameters, Mythalion-13B is distributed in FP16 format, striking a balance between model capability and computational requirements. The model's development leveraged the Axolotl framework, contributing to its robust implementation.
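The FP16 distribution format has a direct consequence for hardware planning. As a rough sketch (the parameter count is approximate and real serving adds KV-cache and activation overhead on top), the weights alone occupy roughly:

```python
# Back-of-the-envelope VRAM estimate for Mythalion-13B in FP16.
# 13B is an approximate parameter count; actual memory use is higher
# once the KV cache and activation buffers are included.

PARAMS = 13_000_000_000   # ~13 billion parameters
BYTES_PER_PARAM = 2       # FP16 = 16 bits = 2 bytes per weight

weights_gib = PARAMS * BYTES_PER_PARAM / 1024**3
print(f"Weights alone: ~{weights_gib:.1f} GiB")
```

This is why a 13B FP16 model typically needs a GPU (or GPUs) with well over 24 GB of VRAM to serve comfortably, matching the VRAM caveat in the local-inference note above.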
Mythalion-13B's training incorporated diverse dialogue and instruction-style datasets inherited from its two parent models.
Early testing indicates that Mythalion-13B outperforms its MythoMax 13B predecessor in both role-playing and chat interactions. However, it's important to note that the model wasn't specifically fine-tuned for safety considerations, and users should be aware it may generate offensive or factually inaccurate content. The model's primary application focus is fictional writing.
The model supports two distinct prompting formats.

Alpaca format, the standard instruction/response template:

    ### Instruction:
    {prompt}

    ### Response:

Pygmalion/Metharme format, which wraps each turn in <|system|>, <|user|>, and <|model|> tokens:

    <|system|>{system prompt}<|user|>{user message}<|model|>

For optimal performance, particularly when using SillyTavern, specific generation settings are recommended; these can be found in the PygmalionAI blog post.
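As a minimal sketch of how the Metharme tokens fit together, the helper below assembles a single-turn prompt string. The function name and the persona/message text are illustrative placeholders, not part of the model card:

```python
# Hypothetical helper: concatenates the Pygmalion/Metharme special tokens
# around a system persona and a user message to form one prompt string.

def build_metharme_prompt(system: str, user: str) -> str:
    """Assemble a single-turn Metharme-style prompt."""
    return f"<|system|>{system}<|user|>{user}<|model|>"

prompt = build_metharme_prompt(
    "Enter RP mode. You are a helpful storyteller.",  # placeholder persona
    "Describe a quiet harbor town at dawn.",          # placeholder message
)
print(prompt)
```

The model's reply is generated as the continuation after the final <|model|> token; multi-turn chats repeat the <|user|>/<|model|> pair for each exchange.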
Mythalion-13B is released under the Llama-2 license, which permits both commercial and non-commercial use. This licensing structure makes it particularly accessible for a wide range of applications while maintaining appropriate usage guidelines.