{"id":5796,"date":"2025-10-09T10:25:07","date_gmt":"2025-10-09T17:25:07","guid":{"rendered":"https:\/\/www.visla.us\/blog\/?p=5796"},"modified":"2025-10-09T10:25:18","modified_gmt":"2025-10-09T17:25:18","slug":"everything-you-need-to-know-about-the-sora-2-launch","status":"publish","type":"post","link":"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/","title":{"rendered":"Everything you need to know about the Sora 2 launch"},"content":{"rendered":"\n<p>The launch of OpenAI\u2019s Sora 2 marks a major step forward in AI-generated video, blending realistic motion, synchronized sound, and new creative tools in one platform. You\u2019ll learn the best ways to write prompts, how OpenAI handles copyright and transparency, and whether businesses can use it effectively. Finally, we\u2019ll look at how Sora 2 connects to Visla, showing how these two platforms serve completely different but complementary roles in modern video creation.<\/p>\n\n\n\n<h2 class=\"wp-block-heading is-style-default\">Quick answer: What is Sora 2<\/h2>\n\n\n\n<p><a href=\"https:\/\/openai.com\/index\/sora-2\/\" target=\"_blank\" rel=\"noreferrer noopener\">Sora 2<\/a> is OpenAI\u2019s next-generation text-to-video model that creates lifelike video clips paired with synchronized audio. It\u2019s built to understand how the real world works, not just how it looks. That means you can type a short prompt like \u201ca skateboarder lands a trick at sunset\u201d and get a cinematic, physics-consistent result instead of a surreal, glitchy one. The goal is to make video generation as natural and expressive as writing a prompt.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">How is Sora 2 different from Sora 1?<\/h3>\n\n\n\n<p>Sora 2 builds on the foundation of the <a href=\"https:\/\/www.visla.us\/blog\/guides\/what-is-openais-sora\/\" target=\"_blank\" rel=\"noreferrer noopener\">first Sora model<\/a>, which was a much simpler text-to-video tool. 
The biggest leap is in realism and control. Sora 2 can now generate both video and audio in sync, handle multi-shot sequences, and simulate real-world physics more reliably. It also introduces <a href=\"https:\/\/help.openai.com\/en\/articles\/12435986-generating-content-with-cameos\" target=\"_blank\" rel=\"noreferrer noopener\">Cameos<\/a>, an opt-in feature that lets you create a digital likeness of yourself for use in your own videos (or by others, if you choose to share it).<\/p>\n\n\n\n<p>You should notice smoother camera movement, better object interaction, and much more accurate lighting and perspective. OpenAI calls this update a move toward \u201c<a href=\"https:\/\/www.wsj.com\/tech\/ai\/world-models-ai-evolution-11275913?gaa_at=eafs&amp;gaa_n=ASWzDAh-hph3mLchDRnZd3RxbElDACeBDEZxyFDzwUaqpIKJlI79NbYfDu0atXrjkdM%3D&amp;gaa_ts=68e7054f&amp;gaa_sig=JQHoh1pOyWRB0DxHZ8jQQHg_QC43HSpcvmOh5nUscUUO3-PEay50xrem0RAMMGndYToAYx5PmHuqxyjVsiAICg%3D%3D\" target=\"_blank\" rel=\"noreferrer noopener\">world simulation<\/a>,\u201d meaning Sora 2 understands how actions and reactions play out physically. It\u2019s not perfect, but it\u2019s significantly closer to film-quality generation than any previous version.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">When did Sora 2 launch?<\/h3>\n\n\n\n<p>OpenAI launched <a href=\"https:\/\/openai.com\/index\/sora-2\/\" target=\"_blank\" rel=\"noreferrer noopener\">Sora 2<\/a> on September 30, 2025. The rollout included the Sora iOS app (available by invite in the U.S. and Canada) and sora.com, a web version that opens up after you\u2019re invited. Developers also got access to the <a href=\"https:\/\/platform.openai.com\/docs\/models\/sora-2\" target=\"_blank\" rel=\"noreferrer noopener\">Sora Video API<\/a>, with clear pricing tiers for both the standard and pro models. 
While early access is limited, OpenAI plans to expand to other regions and user tiers soon.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What can you do with Sora 2?<\/h2>\n\n\n\n<p>At its core, Sora 2 turns short text prompts into detailed, realistic video clips. It also generates corresponding audio, including environmental sounds, speech, and music where appropriate. You can guide it with prompts, reference images, and Cameos to create videos that range from short creative scenes to stylized clips for social media or prototypes for commercial use.<\/p>\n\n\n\n<p>Sora 2 offers controls for clip orientation (portrait or landscape), video length, and overall style. Everything you generate comes watermarked and embedded with C2PA metadata to ensure transparency about AI creation.<\/p>\n\n\n\n<div class=\"wp-block-group has-accent-background-color has-background has-global-padding is-layout-constrained wp-container-core-group-is-layout-c385debf wp-block-group-is-layout-constrained\" style=\"border-radius:10px;padding-top:var(--wp--preset--spacing--20);padding-right:var(--wp--preset--spacing--20);padding-bottom:var(--wp--preset--spacing--20);padding-left:var(--wp--preset--spacing--20)\">\n<h3 class=\"wp-block-heading\">The best practices for writing prompts for Sora 2<\/h3>\n\n\n\n<p>Getting great results with Sora 2 comes down to how you write your prompts. Here are a few field-tested principles:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Be specific but cinematic.<\/strong> Write as if you\u2019re briefing a cinematographer. Describe the camera angle, motion, lighting, and subject in action.<br><\/li>\n\n\n\n<li><strong>Structure your prompt.<\/strong> Break complex scenes into \u201cshot blocks,\u201d each with one clear camera setup and action.<br><\/li>\n\n\n\n<li><strong>Include sensory details.<\/strong> Mention sounds, atmosphere, and mood if they matter. 
Sora 2 generates audio as well as visuals.<br><\/li>\n\n\n\n<li><strong>Use realistic constraints.<\/strong> Avoid overloading your prompt with too many characters or impossible actions. The model performs best with grounded scenarios.<br><\/li>\n\n\n\n<li><strong>Iterate systematically.<\/strong> Once you get something close, adjust one variable at a time, like lighting or lens type, to refine your look.<\/li>\n<\/ol>\n\n\n\n<p>Here\u2019s a simple example prompt that works well:<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-style-plain is-layout-flow wp-block-quote-is-layout-flow is-style-plain--1\">\n<p>Medium shot of a runner on a foggy morning trail, natural camera shake, warm sunrise light filtering through trees, soft footsteps, and birdsong.<\/p>\n<\/blockquote>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"1280\" style=\"aspect-ratio: 704 \/ 1280;\" width=\"704\" controls src=\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/10\/Sora-Video-Oct-8-2025.mp4\" playsinline><\/video><figcaption class=\"wp-element-caption\"><em>This video was generated using the exact prompt above. <\/em><\/figcaption><\/figure>\n\n\n\n<p>That structure gives Sora 2 enough guidance to produce a grounded, coherent scene.<\/p>\n<\/div>\n\n\n\n<h3 class=\"wp-block-heading\">How does Sora 2 handle copyrighted material?<\/h3>\n\n\n\n<p>Sora 2 uses a combination of safety filters and provenance controls. It blocks prompts that try to generate public figures or copyrighted characters. It also prevents uploads that depict real people without consent. All downloads include a moving watermark and <a href=\"https:\/\/c2pa.org\/\" target=\"_blank\" rel=\"noreferrer noopener\">C2PA metadata<\/a> to verify authenticity.<\/p>\n\n\n\n<p>If a rights holder reports an issue, OpenAI has internal systems to flag, trace, and remove the asset. 
The company is also building rights management tools that allow creators and brands to control how their likenesses and characters appear (or don\u2019t appear) in the system.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Do you need the Sora 2 iPhone app to generate videos?<\/h3>\n\n\n\n<p>No. You can use the Sora iOS app or <a href=\"http:\/\/sora.com\">sora.com<\/a> to create video once your account is approved. The app focuses on quick creation and remixing, while the web version is better for editing and downloading. Developers and businesses can use the Sora API directly, so you\u2019re not limited to mobile.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Why are there so many vertical videos on Sora 2?<\/h3>\n\n\n\n<p>Sora 2\u2019s app defaults to vertical orientation because it\u2019s built around a social-style feed and content discovery. That doesn\u2019t mean it\u2019s only for TikTok-style clips, though. You can easily switch to landscape in your settings or prompt for a horizontal shot directly. Vertical video simply works better for handheld, first-person, and selfie-style content, which fits how people experiment creatively inside the app.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Can businesses use Sora 2?<\/h2>\n\n\n\n<p>Yes, though it depends on your goals. Sora 2 is powerful for experimentation, storyboarding, and creative ideation. You can test visual concepts, create short promotional clips, or prototype campaign ideas quickly. However, for brand-consistent or high-stakes commercial projects, you\u2019ll need a platform that builds structure around Sora\u2019s raw output. That&#8217;s where tools like Visla come in.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Do videos generated by Sora 2 look good?<\/h3>\n\n\n\n<p>The quality is impressive, especially for short-form content. 
<a href=\"https:\/\/platform.openai.com\/docs\/models\/sora-2-pro\" target=\"_blank\" rel=\"noreferrer noopener\">Sora 2 Pro (the higher-end API model)<\/a> supports higher resolutions and more consistent detail across frames. Motion is smoother, physics makes more sense, and small elements like shadows, reflections, and hair movement look much more believable.<\/p>\n\n\n\n<p>Still, it\u2019s not perfect. Fast motion can smear slightly, and character consistency can drift over long clips. Cameos sometimes mispronounce lines or misrender facial details, especially in low light. For most social or creative uses, these aren\u2019t dealbreakers. But for polished brand storytelling, pairing Sora with professional editing tools helps finish the job.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Do people want to see Sora 2 videos?<\/h3>\n\n\n\n<p>The early response has been mixed in a healthy way. Audiences are impressed by realism but cautious about authenticity. People respond best when AI content feels purposeful rather than gimmicky.<\/p>\n\n\n\n<p>For example, a short AI-generated product demo or story concept can grab attention when it\u2019s framed as a prototype. But full ads or influencer-style videos using AI likenesses tend to spark more debate. Transparency helps, and that\u2019s why Sora\u2019s watermarking and C2PA data matter.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What about Visla and Sora 2?<\/h2>\n\n\n\n<p>There\u2019s no overlap between what Sora 2 and <a href=\"https:\/\/www.visla.us\/\" target=\"_blank\" rel=\"noreferrer noopener\">Visla<\/a> do. In fact, the two tools complement each other.<\/p>\n\n\n\n<p>Sora 2 is designed for clip generation. It creates a piece of video (with optional audio) from your imagination, a reference image, or a Cameo. Think of it as a creative spark: an isolated shot, moment, sequence, or a potential piece of b-roll.<\/p>\n\n\n\n<p>Visla, on the other hand, focuses on full video storytelling. 
You can take Sora clips, or any raw footage, into Visla and build complete branded stories. Visla handles the <a href=\"https:\/\/www.visla.us\/ai-video-generator\" target=\"_blank\" rel=\"noreferrer noopener\">script<\/a>, <a href=\"https:\/\/www.visla.us\/auto-subtitle-generator\" target=\"_blank\" rel=\"noreferrer noopener\">subtitles<\/a>, <a href=\"https:\/\/www.visla.us\/ai-voice-over\" target=\"_blank\" rel=\"noreferrer noopener\">voiceover<\/a>, <a href=\"https:\/\/www.visla.us\/background-music\" target=\"_blank\" rel=\"noreferrer noopener\">background music<\/a>, <a href=\"https:\/\/www.visla.us\/premium-video-library\" target=\"_blank\" rel=\"noreferrer noopener\">additional footage<\/a>, and more, transforming a single clip into a shareable, professional video.<\/p>\n\n\n\n<p>In other words, Sora gives you the creative raw material, and Visla helps you shape that material into a finished, cohesive narrative that aligns with your brand and audience.<\/p>\n\n\n\n<figure class=\"wp-block-table is-style-stripes\"><table class=\"has-fixed-layout\"><thead><tr><th><strong>Feature<\/strong><\/th><th><strong>Sora 2<\/strong><\/th><th><strong>Visla<\/strong><\/th><\/tr><\/thead><tbody><tr><td><strong>Purpose<\/strong><\/td><td>Generate short video and audio clips<\/td><td>Create full branded, narrative videos<\/td><\/tr><tr><td><strong>Focus<\/strong><\/td><td>Imagination, physics realism, creative shots<\/td><td>Editing, scripting, and brand storytelling<\/td><\/tr><tr><td><strong>Input<\/strong><\/td><td>Text prompts, Cameos, optional reference images<\/td><td>Uploaded clips (AI or real footage), scripts, brand kits<\/td><\/tr><tr><td><strong>Output<\/strong><\/td><td>Watermarked videos with C2PA data<\/td><td>Branded, shareable videos with full structure<\/td><\/tr><tr><td><strong>Best for<\/strong><\/td><td>Ideation, experimentation, visual concepting<\/td><td>Marketing, training, storytelling, and 
publishing<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p>Together, these tools represent two sides of the new creative process. Sora 2 helps you visualize ideas instantly, and Visla helps you refine them into something ready for an audience.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\">Upload your Sora 2 clips to Visla<\/a><\/div>\n<\/div>\n\n\n\n<h2 class=\"wp-block-heading\">FAQ<\/h2>\n\n\n\n<div class=\"schema-faq wp-block-yoast-faq-block\"><div class=\"schema-faq-section\" id=\"faq-question-1759970487743\"><strong class=\"schema-faq-question\">Who owns Sora 2 outputs and can I use them commercially?<\/strong> <p class=\"schema-faq-answer\">You own the outputs you create with Sora 2, as long as you follow OpenAI\u2019s terms and applicable laws. That means you can use them commercially, but you must avoid infringing anyone\u2019s copyrights, trademarks, or publicity rights. If you violate the terms, you can lose rights to use or distribute those outputs and you may face claims from third parties. If your organization uses the API under a services agreement, your customer content stays under your control and OpenAI won\u2019t use it to improve the services unless you explicitly agree.<\/p> <\/div> <div class=\"schema-faq-section\" id=\"faq-question-1760027658600\"><strong class=\"schema-faq-question\"><strong>What happens to the watermark and C2PA metadata when I edit or publish my videos?<\/strong><\/strong> <p class=\"schema-faq-answer\">Sora 2 downloads include a visible moving watermark and embed C2PA provenance metadata. Many social platforms and transcode tools strip metadata during upload or re-encoding, so the C2PA record may not always survive distribution. That\u2019s why best practice is to keep a source-of-truth master file and to configure your pipeline to preserve content credentials when possible. 
Plan for clear disclosure in your captions or credits if downstream platforms remove metadata and remember that removing or obscuring watermarks undermines transparency and audience trust.<\/p> <\/div> <div class=\"schema-faq-section\" id=\"faq-question-1760027667705\"><strong class=\"schema-faq-question\">What are the practical limits and costs across the app and the API?<\/strong> <p class=\"schema-faq-answer\">In the app, generations run under a rolling 24-hour limit per account, so submissions count against your quota for a full day from each request. The API exposes precise controls: you select a model, a supported resolution, and a fixed clip length. Today the documented API durations are 4, 8, or 12 seconds, with resolutions of 1280\u00d7720 or 720\u00d71280 on sora\u20112, plus 1792\u00d71024 or 1024\u00d71792 on sora\u20112\u2011pro. Pricing is per generated second and scales with the chosen model and resolution, so teams should forecast costs by storyboard beat and clip count.<\/p> <\/div> <div class=\"schema-faq-section\" id=\"faq-question-1760027683369\"><strong class=\"schema-faq-question\">How does OpenAI handle my prompts, videos, and cameos from a privacy and data\u2011use perspective?<\/strong> <p class=\"schema-faq-answer\">For business and enterprise API use, OpenAI states it won\u2019t use your customer content to develop or improve the services unless you opt in. For consumer use, check the current Terms of Use and privacy settings to see how your data may be used for safety and service operations. Sensitive content like likeness and voice receives extra guardrails and auditability inside the Sora app. 
Regardless of account type, you should avoid uploading media you don\u2019t have rights to and you should keep internal records of permissions and disclosures for compliance.<\/p> <\/div> <\/div>\n","protected":false},"excerpt":{"rendered":"<p>The launch of OpenAI\u2019s Sora 2 marks a major step forward in AI-generated video, blending realistic motion, synchronized sound, and new creative tools in one platform. You\u2019ll learn the best ways to write prompts, how OpenAI handles copyright and transparency, and whether businesses can use it effectively. Finally, we\u2019ll look at how Sora 2 connects [&hellip;]<\/p>\n","protected":false},"author":9,"featured_media":5810,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[23],"tags":[],"class_list":["post-5796","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Everything you need to know about the Sora 2 launch - The Visla Blog<\/title>\n<meta name=\"description\" content=\"OpenAI&#039;s Sora 2 launch brings realistic video and audio generation, better physics, and new creator tools to their new website, app, and API.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Everything you need to know about the Sora 2 launch - The Visla Blog\" \/>\n<meta property=\"og:description\" content=\"OpenAI&#039;s Sora 2 launch brings realistic video and audio generation, better physics, and new creator tools to their new 
website, app, and API.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\" \/>\n<meta property=\"og:site_name\" content=\"The Visla Blog\" \/>\n<meta property=\"article:published_time\" content=\"2025-10-09T17:25:07+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-10-09T17:25:18+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/10\/Thumbnail-1-1.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1280\" \/>\n\t<meta property=\"og:image:height\" content=\"720\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"May Horiuchi\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"May Horiuchi\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\"},\"author\":{\"name\":\"May Horiuchi\",\"@id\":\"https:\/\/www.visla.us\/blog\/#\/schema\/person\/dcb20e581baf8b9574924cab20d6ae6d\"},\"headline\":\"Everything you need to know about the Sora 2 
launch\",\"datePublished\":\"2025-10-09T17:25:07+00:00\",\"dateModified\":\"2025-10-09T17:25:18+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\"},\"wordCount\":1779,\"publisher\":{\"@id\":\"https:\/\/www.visla.us\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/10\/Thumbnail-1-1.jpg\",\"articleSection\":[\"News\"],\"inLanguage\":\"en-US\"},{\"@type\":[\"WebPage\",\"FAQPage\"],\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\",\"url\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\",\"name\":\"Everything you need to know about the Sora 2 launch - The Visla Blog\",\"isPartOf\":{\"@id\":\"https:\/\/www.visla.us\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/10\/Thumbnail-1-1.jpg\",\"datePublished\":\"2025-10-09T17:25:07+00:00\",\"dateModified\":\"2025-10-09T17:25:18+00:00\",\"description\":\"OpenAI's Sora 2 launch brings realistic video and audio generation, better physics, and new creator tools to their new website, app, and 
API.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#breadcrumb\"},\"mainEntity\":[{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1759970487743\"},{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027658600\"},{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027667705\"},{\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027683369\"}],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#primaryimage\",\"url\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/10\/Thumbnail-1-1.jpg\",\"contentUrl\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/10\/Thumbnail-1-1.jpg\",\"width\":1280,\"height\":720},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.visla.us\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Everything you need to know about the Sora 2 launch\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.visla.us\/blog\/#website\",\"url\":\"https:\/\/www.visla.us\/blog\/\",\"name\":\"The Visla Blog\",\"description\":\"Learn about AI 
video.\",\"publisher\":{\"@id\":\"https:\/\/www.visla.us\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.visla.us\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.visla.us\/blog\/#organization\",\"name\":\"The Visla Blog\",\"url\":\"https:\/\/www.visla.us\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.visla.us\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/03\/Image-brand-color-m.png\",\"contentUrl\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2025\/03\/Image-brand-color-m.png\",\"width\":270,\"height\":235,\"caption\":\"The Visla Blog\"},\"image\":{\"@id\":\"https:\/\/www.visla.us\/blog\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.visla.us\/blog\/#\/schema\/person\/dcb20e581baf8b9574924cab20d6ae6d\",\"name\":\"May Horiuchi\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2024\/06\/IMG_6108-2.jpg\",\"url\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2024\/06\/IMG_6108-2.jpg\",\"contentUrl\":\"https:\/\/www.visla.us\/wp-content\/uploads\/2024\/06\/IMG_6108-2.jpg\",\"caption\":\"May Horiuchi\"},\"description\":\"May is a Content Specialist and AI Expert for Visla. 
She is an in-house expert on anything Visla and loves testing out different AI tools to figure out which ones are actually helpful and useful for content creators, businesses, and organizations.\",\"url\":\"https:\/\/www.visla.us\/blog\/author\/mark-horiuchi\/\"},{\"@type\":\"Question\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1759970487743\",\"position\":1,\"url\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1759970487743\",\"name\":\"Who owns Sora 2 outputs and can I use them commercially?\",\"answerCount\":1,\"acceptedAnswer\":{\"@type\":\"Answer\",\"text\":\"You own the outputs you create with Sora 2, as long as you follow OpenAI\u2019s terms and applicable laws. That means you can use them commercially, but you must avoid infringing anyone\u2019s copyrights, trademarks, or publicity rights. If you violate the terms, you can lose rights to use or distribute those outputs and you may face claims from third parties. If your organization uses the API under a services agreement, your customer content stays under your control and OpenAI won\u2019t use it to improve the services unless you explicitly agree.\",\"inLanguage\":\"en-US\"},\"inLanguage\":\"en-US\"},{\"@type\":\"Question\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027658600\",\"position\":2,\"url\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027658600\",\"name\":\"What happens to the watermark and C2PA metadata when I edit or publish my videos?\",\"answerCount\":1,\"acceptedAnswer\":{\"@type\":\"Answer\",\"text\":\"Sora 2 downloads include a visible moving watermark and embed C2PA provenance metadata. 
Many social platforms and transcode tools strip metadata during upload or re-encoding, so the C2PA record may not always survive distribution. That\u2019s why best practice is to keep a source-of-truth master file and to configure your pipeline to preserve content credentials when possible. Plan for clear disclosure in your captions or credits if downstream platforms remove metadata and remember that removing or obscuring watermarks undermines transparency and audience trust.\",\"inLanguage\":\"en-US\"},\"inLanguage\":\"en-US\"},{\"@type\":\"Question\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027667705\",\"position\":3,\"url\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027667705\",\"name\":\"What are the practical limits and costs across the app and the API?\",\"answerCount\":1,\"acceptedAnswer\":{\"@type\":\"Answer\",\"text\":\"In the app, generations run under a rolling 24-hour limit per account, so submissions count against your quota for a full day from each request. The API exposes precise controls: you select a model, a supported resolution, and a fixed clip length. Today the documented API durations are 4, 8, or 12 seconds, with resolutions of 1280\u00d7720 or 720\u00d71280 on sora\u20112, plus 1792\u00d71024 or 1024\u00d71792 on sora\u20112\u2011pro. 
Pricing is per generated second and scales with the chosen model and resolution, so teams should forecast costs by storyboard beat and clip count.\",\"inLanguage\":\"en-US\"},\"inLanguage\":\"en-US\"},{\"@type\":\"Question\",\"@id\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027683369\",\"position\":4,\"url\":\"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/#faq-question-1760027683369\",\"name\":\"How does OpenAI handle my prompts, videos, and cameos from a privacy and data\u2011use perspective?\",\"answerCount\":1,\"acceptedAnswer\":{\"@type\":\"Answer\",\"text\":\"For business and enterprise API use, OpenAI states it won\u2019t use your customer content to develop or improve the services unless you opt in. For consumer use, check the current Terms of Use and privacy settings to see how your data may be used for safety and service operations. Sensitive content like likeness and voice receives extra guardrails and auditability inside the Sora app. Regardless of account type, you should avoid uploading media you don\u2019t have rights to and you should keep internal records of permissions and disclosures for compliance.\",\"inLanguage\":\"en-US\"},\"inLanguage\":\"en-US\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Everything you need to know about the Sora 2 launch - The Visla Blog","description":"OpenAI's Sora 2 launch brings realistic video and audio generation, better physics, and new creator tools to their new website, app, and API.","canonical":"https:\/\/www.visla.us\/blog\/news\/everything-you-need-to-know-about-the-sora-2-launch\/","author":"May Horiuchi","article_published_time":"2025-10-09T17:25:07+00:00","article_modified_time":"2025-10-09T17:25:18+00:00"}}