Pygmalion-2 13B is an instruction-tuned language model developed by PygmalionAI, built on Meta AI's 13-billion-parameter Llama-2 architecture. The model is fine-tuned for conversational and fictional text generation on a composite dataset that includes the proprietary PIPPA dataset, roleplay interactions, and instruction data. It uses a structured prompting format with system, user, and model tokens to support multi-turn dialogue and role-based interaction, making it well suited to creative writing and entertainment-focused text generation.
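As a rough illustration of the prompting scheme described above, the sketch below assembles a multi-turn prompt from a system persona and alternating user/model turns. The literal token spellings `<|system|>`, `<|user|>`, and `<|model|>`, the example persona text, and the `build_prompt` helper are illustrative assumptions, not an authoritative rendering of the model's exact template.

```python
# Minimal sketch of a multi-turn prompt in a system/user/model token format.
# The token spellings below are assumptions for illustration; consult the
# model card for the exact template before use.

def build_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Assemble a prompt from a system persona and (user, model) turn pairs."""
    parts = [f"<|system|>{system}"]
    for user_msg, model_msg in turns:
        parts.append(f"<|user|>{user_msg}")
        parts.append(f"<|model|>{model_msg}")
    # End with an open model token so generation continues from here.
    parts.append("<|user|>Tell me a short story about a lighthouse.")
    parts.append("<|model|>")
    return "".join(parts)

prompt = build_prompt(
    "Enter roleplay mode. You are a seasoned storyteller.",
    [("Hi there!", "Hello! What tale shall we spin today?")],
)
print(prompt)
```

Keeping the final `<|model|>` token open signals to the model that it is its turn to generate, which is the usual pattern for turn-delimited chat templates.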