Compare commits

...

30 Commits

Author SHA1 Message Date
a7d5bc436e Fixed the /oscilloscope view to show the correct fields in the right locations 2025-12-21 19:51:34 +01:00
gpt-engineer-app[bot]
f3263cd29e Fix intro video and visuals
Introduce reliable intro.webm playback before the main video, switching to a webm-first intro with proper frame handling; reorganize the oscilloscope controls, add spectrum visualization support, and adjust audio input placement and mic controls. Update the oscilloscope to render spectrum mode and a side-by-side layout with live display options.

X-Lovable-Edit-ID: edt-8fca3b32-298d-48b9-97c3-d077b9c82892
2025-12-21 18:10:01 +00:00
gpt-engineer-app[bot]
8ecd8da712 Changes 2025-12-21 18:10:00 +00:00
gpt-engineer-app[bot]
043b06d6ea Add intro to export
Integrate intro.mp4/intro.webm at the start of exports with a fade-in; fix mobile music player hover behavior; move drop/upload and mic controls beneath the oscilloscope content for a mobile-friendly layout. Also adjust the UI to place the audio input below the main display.

X-Lovable-Edit-ID: edt-d9fc8a84-3564-4af4-b79d-75465466c78c
2025-12-21 17:45:07 +00:00
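
The export pipeline that prepends the intro is not part of the visible diff (it lives in useVideoExporter), so the following is only a minimal sketch, assuming the exporter draws into a canvas that is being recorded: a hypothetical drawIntro helper plays intro.webm into an off-screen video element and paints its frames, with a fade-in, before any oscilloscope frames are drawn.

// Hypothetical helper: draw intro.webm onto the export canvas before the main render.
// Assumes the exporter already records `canvas` (e.g. via canvas.captureStream()).
async function drawIntro(canvas: HTMLCanvasElement, introUrl = '/intro.webm'): Promise<void> {
  const ctx = canvas.getContext('2d');
  if (!ctx) return;

  const video = document.createElement('video');
  video.src = introUrl;
  video.muted = true; // required so play() succeeds without a user gesture
  await video.play();

  await new Promise<void>((resolve) => {
    const drawFrame = () => {
      if (video.ended) {
        resolve();
        return;
      }
      // Fade the intro in over its first second
      ctx.globalAlpha = Math.min(1, video.currentTime);
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      ctx.globalAlpha = 1;
      requestAnimationFrame(drawFrame);
    };
    drawFrame();
  });
}

In such a scheme the oscilloscope render loop would only start once this promise resolves.
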
gpt-engineer-app[bot]
432537e79f Changes 2025-12-21 17:45:07 +00:00
dd01c54fb6 added intro video, which still has to be wired into the code that generates an oscilloscope video 2025-12-21 18:39:27 +01:00
gpt-engineer-app[bot]
c96d9f210e Arrange oscilloscope layout beside controls and shrink mini view on mobile
Move oscilloscope display to sit next to settings/controls (side-by-side layout) and make mini oscilloscope smaller on mobile to avoid blocking terminal and music icons, with updated responsive sizing.

X-Lovable-Edit-ID: edt-09ee86ef-59e1-4283-a96d-d4ea6abbe5cf
2025-12-21 15:18:55 +00:00
gpt-engineer-app[bot]
64fec848fe Changes 2025-12-21 15:18:54 +00:00
gpt-engineer-app[bot]
6693f94b65 Improve oscilloscope init and sync
- Ensure oscilloscope starts immediately and animates at full FPS
- Sync visualization with audio element without relying on slow timeupdate
- Update OscilloscopeDisplay to read from audio element directly and handle seek properly
- Adjust playback integration in Oscilloscope.tsx to pass proper refs and props

X-Lovable-Edit-ID: edt-073ab13c-7eb6-4c9d-b2fd-ac84088b119a
2025-12-21 15:00:46 +00:00
gpt-engineer-app[bot]
2b85bc4825 Changes 2025-12-21 15:00:46 +00:00
gpt-engineer-app[bot]
8fd0ef4f53 Refactor oscilloscope UI
- Rename ControlPanel to OscilloscopeControls for clarity
- Integrate live audio playback with a progress bar and time display
- Remove MP4 export; WebM only with updated export options (recording approach sketched after this entry)
- Align oscilloscope components to use shared audio analyzer context
- Enable immediate oscilloscope initialization and live display modes (Combined/XY)
- Improve styling and responsive sizing of mini/full oscilloscope views

X-Lovable-Edit-ID: edt-28a85a31-f522-4420-8353-3be1a784d5c4
2025-12-21 14:55:20 +00:00
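
The WebM-only exporter itself is not shown in this diff; the sketch below is only an illustration of the general approach such a useVideoExporter hook could take (canvas.captureStream plus a VP9 WebM MediaRecorder). The function name, bitrate, and timing are assumptions, not the real implementation.

// Hedged sketch: record a canvas to a WebM blob, preferring VP9 when supported.
function recordCanvasAsWebm(canvas: HTMLCanvasElement, fps: number, durationMs: number): Promise<Blob> {
  const stream = canvas.captureStream(fps);
  const mimeType = MediaRecorder.isTypeSupported('video/webm;codecs=vp9')
    ? 'video/webm;codecs=vp9'
    : 'video/webm';
  const recorder = new MediaRecorder(stream, { mimeType, videoBitsPerSecond: 8_000_000 });
  const chunks: Blob[] = [];

  return new Promise((resolve) => {
    recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
    recorder.onstop = () => resolve(new Blob(chunks, { type: 'video/webm' }));
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs); // stop once the render loop has run long enough
  });
}
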
gpt-engineer-app[bot]
9997558c3a Changes 2025-12-21 14:55:19 +00:00
gpt-engineer-app[bot]
53f2dc6e96 Fix oscilloscope init and mode
- Ensure oscilloscope renders immediately by initializing audio analyzer context on load
- Simplify live display mode to support only Combined and All/XY, and wire display mode to live visualization
- Adjust panels to reflect new mode options and ensure live data switches render correctly
- Update OscilloscopeDisplay to honor live display mode changes and use shared audio analyzer
- Minor tweaks to MiniOscilloscope for consistent visualization and navigation behavior

X-Lovable-Edit-ID: edt-33738654-f505-4a6a-9854-8683f1b4efdb
2025-12-21 14:30:48 +00:00
gpt-engineer-app[bot]
6e3d0e8918 Changes 2025-12-21 14:30:47 +00:00
gpt-engineer-app[bot]
cc2612918c Clarify oscilloscope modes
- Separate live display controls from export display mode
- Allow live visualization options (line thickness, grid, glow) to affect the real-time graph
- Update Oscilloscope to pass new liveSettings to panels and display components
- Fix duplicate variable declarations in OscilloscopeDisplay drawFrame logic

X-Lovable-Edit-ID: edt-fc124d7f-b9d9-4269-96b8-0925004fe070
2025-12-21 14:24:16 +00:00
gpt-engineer-app[bot]
8173565521 Changes 2025-12-21 14:24:15 +00:00
gpt-engineer-app[bot]
e02bd14b2b Tune oscilloscope visuals
Remove idle NO SIGNAL state, widen mini oscilloscope, align with theme colors, and connect full oscilloscope to shared audio analyzer to visualize music and sound effects continuously. Also updated OscilloscopeDisplay to use shared analyzer data.

X-Lovable-Edit-ID: edt-8830a45a-8e37-44e7-b833-ddf573c55606
2025-12-21 14:09:40 +00:00
gpt-engineer-app[bot]
d4f544168d Changes 2025-12-21 14:09:40 +00:00
gpt-engineer-app[bot]
cdc0f6d45e Fix audio routing
- Implement shared AudioAnalyzer flow by wiring MiniOscilloscope and MusicContext to a single AudioContext and AnalyserNode
- Route music and sound effects through the shared analyzer
- Restore and enlarge mini oscilloscope, ensure it opens /oscilloscope on click
- Update App wiring to include AudioAnalyzerProvider and adapt contexts accordingly

X-Lovable-Edit-ID: edt-787fd745-f007-47ee-b161-626997f20f27
2025-12-21 13:55:17 +00:00
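
AudioAnalyzerContext.tsx is referenced throughout this diff but its contents are not shown. The sketch below is a hedged reconstruction consistent with how it is consumed later (`const { analyzerNode } = useAudioAnalyzer()`); the connectElement helper and internal names are assumptions.

// Hypothetical sketch of a shared AudioContext + AnalyserNode provider.
import { createContext, useCallback, useContext, useRef, useState, ReactNode } from 'react';

interface AudioAnalyzerValue {
  analyzerNode: AnalyserNode | null;
  connectElement: (el: HTMLAudioElement) => void;
}

const AudioAnalyzerContext = createContext<AudioAnalyzerValue>({
  analyzerNode: null,
  connectElement: () => {},
});

export function AudioAnalyzerProvider({ children }: { children: ReactNode }) {
  const [analyzerNode, setAnalyzerNode] = useState<AnalyserNode | null>(null);
  const audioCtxRef = useRef<AudioContext | null>(null);
  const analyzerRef = useRef<AnalyserNode | null>(null);
  const connectedRef = useRef(new WeakSet<HTMLAudioElement>());

  const connectElement = useCallback((el: HTMLAudioElement) => {
    // Lazily create a single AudioContext + AnalyserNode shared by music and sound effects
    if (!audioCtxRef.current) {
      const ctx = new AudioContext();
      const analyser = ctx.createAnalyser();
      analyser.fftSize = 2048;
      analyser.connect(ctx.destination); // keep the audio audible
      audioCtxRef.current = ctx;
      analyzerRef.current = analyser;
      setAnalyzerNode(analyser);
    }
    // A media element can only be wrapped in one MediaElementSourceNode
    if (connectedRef.current.has(el)) return;
    const source = audioCtxRef.current.createMediaElementSource(el);
    source.connect(analyzerRef.current!); // element -> analyser -> destination
    connectedRef.current.add(el);
  }, []);

  return (
    <AudioAnalyzerContext.Provider value={{ analyzerNode, connectElement }}>
      {children}
    </AudioAnalyzerContext.Provider>
  );
}

export const useAudioAnalyzer = () => useContext(AudioAnalyzerContext);
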
gpt-engineer-app[bot]
a9235fdb3f Changes 2025-12-21 13:55:16 +00:00
gpt-engineer-app[bot]
3f05ec4015 Link mini oscilloscope to music
- Fix connection to music audio for MiniOscilloscope by wiring to MusicContext audioElement and ensuring real-time waveform from active audio sources.
- Update MiniOscilloscope to discover and attach to all current and future audio elements, with idle animation and click navigation to /oscilloscope.
- Adjust MusicContext to expose audioElement for external consumers.

X-Lovable-Edit-ID: edt-731dd6ba-bc18-4933-beb8-3df453876b84
2025-12-21 13:47:32 +00:00
gpt-engineer-app[bot]
56862114ec Changes 2025-12-21 13:47:32 +00:00
gpt-engineer-app[bot]
fd8f1671ca Add live oscilloscope bar
Introduce a new MiniOscilloscope bar at the bottom that links to /oscilloscope, modify MainLayout to render it, and fix the offline video export scaffolding to support audio chunks and buffers, enabling integration with the oscilloscope feature. Adjust the layout to ensure accessibility and navigation.

X-Lovable-Edit-ID: edt-b2174f84-2f48-457a-857a-e5f0e9c64c83
2025-12-21 13:39:54 +00:00
gpt-engineer-app[bot]
e23f6b55fb Changes 2025-12-21 13:39:53 +00:00
614811167f Audio Playback Controls
- Playback Speed Control: 0.5x, 1x, 1.5x, and 2x speed options
- Looping Toggle: Enable/disable automatic looping of audio playback
- Seeking: Click anywhere on the oscilloscope display to jump to that position in the audio (see the click-to-seek sketch after this entry)
 Export Options
- Resolution Selection: Choose from 640×480, 1280×720, or 1920×1080
- Frame Rate Options: 24, 30, or 60 FPS
- Format Selection: WebM or MP4 (if supported)
- Quality Settings: Low, Medium, or High quality presets
 Microphone Calibration
- Real-time Level Monitoring: Visual indicator showing current microphone input level
- Gain Control: Adjustable gain slider (0.1x to 3x) to optimize input levels
- Visual Feedback: Color-coded level indicator (red=too loud, yellow=good, green=too quiet)
- Calibration Mode: Dedicated calibration panel that appears when mic is active
2025-12-21 14:29:17 +01:00
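
The click-to-seek mapping itself is not visible in the truncated OscilloscopeDisplay listing further down (only the onSeek/handleSeek plumbing is), so this is an illustrative sketch of how a canvas click could be translated into the 0..1 position those props expect; the handler body is an assumption.

// Illustrative click-to-seek wiring for the display canvas.
import type { MouseEvent } from 'react';

function handleCanvasClick(
  e: MouseEvent<HTMLCanvasElement>,
  onSeek?: (position: number) => void,
) {
  const rect = e.currentTarget.getBoundingClientRect();
  // Horizontal click position as a 0..1 fraction of the canvas width
  const position = (e.clientX - rect.left) / rect.width;
  onSeek?.(Math.min(1, Math.max(0, position)));
}

// Usage: <canvas ref={canvasRef} onClick={(e) => handleCanvasClick(e, onSeek)} />
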
cde5f34858 Command History
- Up/Down Arrow Navigation: Users can now press ↑/↓ to navigate through previously executed commands (a keydown sketch follows this entry)
- History Storage: Commands are stored in state and persist during the session
- Duplicate Prevention: Avoids adding the same command multiple times consecutively
- Reset on Typing: Manual typing resets the history navigation position
 Tab Completion
- Auto-complete: Pressing Tab completes partial commands that match available commands
- Single Match: If only one command matches, it completes the full command
- Multiple Matches: If multiple commands match, displays all possible completions in the terminal output
- Case-insensitive: Works regardless of input case
2025-12-21 13:57:47 +01:00
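
The terminal component that implements these features is not included in this diff; the following keydown handler is a hedged sketch of the described behavior, and all names (COMMANDS, history, historyIndex, print) are illustrative.

// Hedged sketch of history navigation and tab completion for a terminal input.
import type { KeyboardEvent } from 'react';

const COMMANDS = ['help', 'whoami', 'clear', 'music', 'oscilloscope'];

function handleTerminalKeyDown(
  e: KeyboardEvent<HTMLInputElement>,
  input: string,
  setInput: (value: string) => void,
  history: string[],
  historyIndex: number, // starts at history.length (one past the newest entry)
  setHistoryIndex: (index: number) => void,
  print: (line: string) => void,
) {
  if (e.key === 'ArrowUp' && history.length > 0) {
    e.preventDefault();
    const next = Math.max(0, historyIndex - 1); // walk back toward the oldest command
    setHistoryIndex(next);
    setInput(history[next]);
  } else if (e.key === 'ArrowDown' && history.length > 0) {
    e.preventDefault();
    const next = Math.min(history.length, historyIndex + 1);
    setHistoryIndex(next);
    setInput(next === history.length ? '' : history[next]); // past the end = empty prompt
  } else if (e.key === 'Tab') {
    e.preventDefault();
    const matches = COMMANDS.filter((c) => c.startsWith(input.toLowerCase()));
    if (matches.length === 1) setInput(matches[0]);          // single match: complete it
    else if (matches.length > 1) print(matches.join('  '));  // several matches: list them
  }
}
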
1495a9a5c5 Implemented Oscilloscope correctly now
Have tested it with a 5-minute-long audio file
Bigger files and different formats still need testing
2025-12-21 13:51:16 +01:00
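
The useAudioAnalyzer hook that decodes the uploaded file is not part of the visible diff; a minimal decoding sketch, assuming the AudioData fields match how OscilloscopeDisplay reads them (leftChannel, rightChannel, sampleRate, duration), could look like this.

// Sketch: decode an uploaded file into per-channel PCM data for the oscilloscope.
interface AudioData {
  leftChannel: Float32Array;
  rightChannel: Float32Array;
  sampleRate: number;
  duration: number;
}

async function decodeAudioFile(file: File): Promise<AudioData> {
  const arrayBuffer = await file.arrayBuffer();
  const ctx = new AudioContext();
  const buffer = await ctx.decodeAudioData(arrayBuffer);
  const leftChannel = buffer.getChannelData(0);
  // Mono files: reuse the single channel for both sides
  const rightChannel = buffer.numberOfChannels > 1 ? buffer.getChannelData(1) : leftChannel;
  await ctx.close(); // free the decoder once the PCM data is extracted
  return {
    leftChannel,
    rightChannel,
    sampleRate: buffer.sampleRate,
    duration: buffer.duration,
  };
}
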
ad6587978a Correct files to integrate with the site for a good audio to A/V oscilloscope converter
Now it only has to be implemented in the actual website
2025-12-21 13:21:20 +01:00
e227743728 Oscilloscope integrated but the exported video is broken 2025-12-20 15:50:04 +01:00
26584ea848 Before attempting to integrate the oscilloscope properly. STILL BROKEN NOW 2025-12-20 15:34:07 +01:00
24 changed files with 5726 additions and 111 deletions

BIN
intro.mp4 Normal file

Binary file not shown.

BIN
intro.webm Normal file

Binary file not shown.

package-lock.json (generated): 1889 lines changed

File diff suppressed because it is too large.

View File

@@ -76,12 +76,16 @@
     "eslint": "^9.32.0",
     "eslint-plugin-react-hooks": "^5.2.0",
     "eslint-plugin-react-refresh": "^0.4.20",
+    "express": "^5.2.1",
+    "gif.js": "^0.2.0",
     "globals": "^15.15.0",
     "lovable-tagger": "^1.1.11",
     "postcss": "^8.5.6",
+    "puppeteer": "^24.34.0",
     "tailwindcss": "^3.4.17",
     "typescript": "^5.8.3",
     "typescript-eslint": "^8.38.0",
-    "vite": "^7.2.6"
+    "vite": "^7.2.6",
+    "webm-writer": "^1.0.0"
   }
 }

View File

@@ -7,6 +7,7 @@ import { BrowserRouter, Routes, Route } from "react-router-dom";
 import { SettingsProvider, useSettings } from "@/contexts/SettingsContext";
 import { MusicProvider } from "@/contexts/MusicContext";
 import { AchievementsProvider } from "@/contexts/AchievementsContext";
+import { AudioAnalyzerProvider } from "@/contexts/AudioAnalyzerContext";

 // Import Miner and Job classes
 import Miner from '../miner/src/js/miner';
@@ -30,6 +31,7 @@ const Pacman = lazy(() => import("./pages/Pacman"));
 const Snake = lazy(() => import("./pages/Snake"));
 const Breakout = lazy(() => import("./pages/Breakout"));
 const Music = lazy(() => import("./pages/Music"));
+const Oscilloscope = lazy(() => import("./pages/Oscilloscope"));
 const AIChat = lazy(() => import("./pages/AIChat"));
 const Achievements = lazy(() => import("./pages/Achievements"));
 const Credits = lazy(() => import("./pages/Credits"));
@@ -125,6 +127,7 @@ const AppContent = () => {
         <Route path="games/breakout" element={<Breakout />} />
         <Route path="faq" element={<FAQ />} />
         <Route path="music" element={<Music />} />
+        <Route path="oscilloscope" element={<Oscilloscope />} />
         <Route path="ai" element={<AIChat />} />
         <Route path="achievements" element={<Achievements />} />
         <Route path="credits" element={<Credits />} />
@@ -137,6 +140,7 @@ const AppContent = () => {

 const App = () => (
   <QueryClientProvider client={queryClient}>
+    <AudioAnalyzerProvider>
     <SettingsProvider>
       <MusicProvider>
         <TooltipProvider>
@@ -150,6 +154,7 @@ const App = () => (
         </TooltipProvider>
       </MusicProvider>
     </SettingsProvider>
+    </AudioAnalyzerProvider>
   </QueryClientProvider>
 );

View File

@@ -0,0 +1,78 @@
import { useCallback } from 'react';
import { Upload, Music } from 'lucide-react';
import { cn } from '@/lib/utils';
interface AudioUploaderProps {
onFileSelect: (file: File) => void;
isLoading: boolean;
fileName: string | null;
}
export function AudioUploader({ onFileSelect, isLoading, fileName }: AudioUploaderProps) {
const handleDrop = useCallback((e: React.DragEvent) => {
e.preventDefault();
const file = e.dataTransfer.files[0];
if (file && file.type.startsWith('audio/')) {
onFileSelect(file);
}
}, [onFileSelect]);
const handleFileInput = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (file) {
onFileSelect(file);
}
}, [onFileSelect]);
const handleDragOver = (e: React.DragEvent) => {
e.preventDefault();
};
return (
<div
className={cn(
"relative border-2 border-dashed border-primary/40 rounded-lg py-24 px-8 text-center",
"hover:border-primary/70 transition-all duration-300 cursor-pointer",
"bg-secondary/20 hover:bg-secondary/30",
isLoading && "opacity-50 pointer-events-none"
)}
onDrop={handleDrop}
onDragOver={handleDragOver}
>
<input
type="file"
accept="audio/*"
onChange={handleFileInput}
className="absolute inset-0 w-full h-full opacity-0 cursor-pointer"
disabled={isLoading}
/>
<div className="flex flex-col items-center gap-4">
{fileName ? (
<>
<Music className="w-12 h-12 text-primary phosphor-glow" />
<div>
<p className="text-lg font-crt text-primary text-glow">{fileName}</p>
<p className="text-sm text-muted-foreground mt-1">Click or drop to replace</p>
</div>
</>
) : (
<>
<Upload className="w-12 h-12 text-primary/60" />
<div>
<p className="text-lg font-crt text-primary/80">Drop audio file here</p>
<p className="text-sm text-muted-foreground mt-1">or click to browse</p>
<p className="text-xs text-muted-foreground mt-2">MP3, WAV, FLAC, OGG supported</p>
</div>
</>
)}
{isLoading && (
<div className="absolute inset-0 flex items-center justify-center bg-background/50 rounded-lg">
<div className="w-8 h-8 border-2 border-primary border-t-transparent rounded-full animate-spin" />
</div>
)}
</div>
</div>
);
}

View File

@@ -1,27 +1,27 @@
-import { Outlet } from 'react-router-dom';
+import { Outlet, useLocation } from 'react-router-dom';
 import { motion } from 'framer-motion';
 import Sidebar from './Sidebar';
+import { MiniOscilloscope } from './MiniOscilloscope';

 const MainLayout = () => {
-  return <motion.div initial={{
-    opacity: 0,
-    scale: 0.95
-  }} animate={{
-    opacity: 1,
-    scale: 1
-  }} transition={{
-    duration: 0.5
-  }} className="relative z-10 flex flex-col items-center pt-8 md:pt-12 px-4 w-full">
+  const location = useLocation();
+  // Don't show mini oscilloscope on the oscilloscope page itself
+  const showMiniOscilloscope = location.pathname !== '/oscilloscope';
+
+  return (
+    <motion.div
+      initial={{ opacity: 0, scale: 0.95 }}
+      animate={{ opacity: 1, scale: 1 }}
+      transition={{ duration: 0.5 }}
+      className="relative z-10 flex flex-col items-center pt-8 md:pt-12 px-4 w-full pb-20"
+    >
       {/* Branding */}
-      <motion.h1 initial={{
-        opacity: 0,
-        y: -20
-      }} animate={{
-        opacity: 1,
-        y: 0
-      }} transition={{
-        delay: 0.3,
-        duration: 0.5
-      }} className="font-minecraft text-4xl md:text-5xl lg:text-6xl text-primary text-glow-strong mb-6">
+      <motion.h1
+        initial={{ opacity: 0, y: -20 }}
+        animate={{ opacity: 1, y: 0 }}
+        transition={{ delay: 0.3, duration: 0.5 }}
+        className="font-minecraft text-4xl md:text-5xl lg:text-6xl text-primary text-glow-strong mb-6"
+      >
         <span className="inline-block translate-y-[0.35em]">~</span>$ whoami Jory
       </motion.h1>
@@ -34,6 +34,11 @@ const MainLayout = () => {
         <Outlet />
       </main>
     </div>
-  </motion.div>;
+
+      {/* Mini Oscilloscope Bar */}
+      {showMiniOscilloscope && <MiniOscilloscope />}
+    </motion.div>
+  );
 };

 export default MainLayout;

View File

@@ -0,0 +1,177 @@
import { useEffect, useRef } from 'react';
import { useNavigate, useLocation } from 'react-router-dom';
import { useSettings } from '@/contexts/SettingsContext';
import { useAudioAnalyzer } from '@/contexts/AudioAnalyzerContext';
// Get CSS variable value
function getCSSVar(name: string): string {
return getComputedStyle(document.documentElement).getPropertyValue(name).trim();
}
export function MiniOscilloscope() {
const canvasRef = useRef<HTMLCanvasElement>(null);
const animationRef = useRef<number>();
const navigate = useNavigate();
const location = useLocation();
const { playSound } = useSettings();
const { analyzerNode } = useAudioAnalyzer();
// Draw waveform
useEffect(() => {
const canvas = canvasRef.current;
if (!canvas) return;
const ctx = canvas.getContext('2d');
if (!ctx) return;
const draw = () => {
const width = canvas.width;
const height = canvas.height;
// Get theme colors
const primaryHsl = getCSSVar('--primary');
const primaryColor = primaryHsl ? `hsl(${primaryHsl})` : 'hsl(120, 100%, 50%)';
const primaryColorDim = primaryHsl ? `hsl(${primaryHsl} / 0.3)` : 'hsl(120, 100%, 50%, 0.3)';
const primaryColorFaint = primaryHsl ? `hsl(${primaryHsl} / 0.1)` : 'hsl(120, 100%, 50%, 0.1)';
const bgHsl = getCSSVar('--background');
const bgColor = bgHsl ? `hsl(${bgHsl} / 0.8)` : 'rgba(0, 0, 0, 0.6)';
// Clear with transparent background
ctx.clearRect(0, 0, width, height);
// Draw background with theme color
ctx.fillStyle = bgColor;
ctx.fillRect(0, 0, width, height);
// Draw grid lines with theme color
ctx.strokeStyle = primaryColorFaint;
ctx.lineWidth = 1;
// Vertical grid lines
for (let x = 0; x < width; x += 20) {
ctx.beginPath();
ctx.moveTo(x, 0);
ctx.lineTo(x, height);
ctx.stroke();
}
// Center line with theme color
ctx.strokeStyle = primaryColorDim;
ctx.beginPath();
ctx.moveTo(0, height / 2);
ctx.lineTo(width, height / 2);
ctx.stroke();
// Draw waveform from analyzer or flat line
if (analyzerNode) {
const bufferLength = analyzerNode.frequencyBinCount;
const dataArray = new Uint8Array(bufferLength);
analyzerNode.getByteTimeDomainData(dataArray);
// Check if there's actual audio (not just silence)
const hasAudio = dataArray.some(v => Math.abs(v - 128) > 2);
ctx.strokeStyle = primaryColor;
ctx.lineWidth = 2;
ctx.shadowColor = primaryColor;
ctx.shadowBlur = hasAudio ? 10 : 0;
ctx.beginPath();
if (hasAudio) {
const sliceWidth = width / bufferLength;
let x = 0;
for (let i = 0; i < bufferLength; i++) {
const v = dataArray[i] / 128.0;
const y = (v * height) / 2;
if (i === 0) {
ctx.moveTo(x, y);
} else {
ctx.lineTo(x, y);
}
x += sliceWidth;
}
} else {
// Flat line when no audio
ctx.moveTo(0, height / 2);
ctx.lineTo(width, height / 2);
}
ctx.stroke();
ctx.shadowBlur = 0;
} else {
// Flat line when no analyzer
ctx.strokeStyle = primaryColor;
ctx.lineWidth = 2;
ctx.beginPath();
ctx.moveTo(0, height / 2);
ctx.lineTo(width, height / 2);
ctx.stroke();
}
animationRef.current = requestAnimationFrame(draw);
};
draw();
return () => {
if (animationRef.current) {
cancelAnimationFrame(animationRef.current);
}
};
}, [analyzerNode]);
// Handle resize
useEffect(() => {
const canvas = canvasRef.current;
if (!canvas) return;
const resizeCanvas = () => {
const container = canvas.parentElement;
if (container) {
canvas.width = container.clientWidth;
canvas.height = container.clientHeight;
}
};
resizeCanvas();
window.addEventListener('resize', resizeCanvas);
return () => {
window.removeEventListener('resize', resizeCanvas);
};
}, []);
const handleClick = () => {
playSound('click');
navigate('/oscilloscope');
};
// Hide on oscilloscope page
if (location.pathname === '/oscilloscope') {
return null;
}
return (
<div
onClick={handleClick}
className="fixed bottom-6 left-1/2 -translate-x-1/2 w-[200px] h-[50px] md:w-[600px] md:h-[80px] z-50 cursor-pointer group"
title="Open Oscilloscope"
>
<div className="relative w-full h-full rounded-lg border border-primary/50 overflow-hidden bg-background/80 backdrop-blur-sm transition-all duration-300 group-hover:border-primary group-hover:shadow-[0_0_20px_hsl(var(--primary)/0.4)]">
<canvas
ref={canvasRef}
className="w-full h-full"
/>
{/* Hover overlay */}
<div className="absolute inset-0 flex items-center justify-center bg-primary/0 group-hover:bg-primary/10 transition-colors duration-300">
<span className="opacity-0 group-hover:opacity-100 transition-opacity duration-300 font-crt text-xs text-primary text-glow">
OPEN OSCILLOSCOPE
</span>
</div>
</div>
</div>
);
}

View File

@@ -1,4 +1,4 @@
-import { useEffect } from 'react';
+import { useEffect, useRef } from 'react';
 import { Play, Pause, Volume2, Music2, SkipBack, SkipForward, Loader2 } from 'lucide-react';
 import { useSettings } from '@/contexts/SettingsContext';
 import { useMusic } from '@/contexts/MusicContext';
@@ -8,6 +8,7 @@ import { useState } from 'react';

 const MusicPlayer = () => {
   const [isExpanded, setIsExpanded] = useState(false);
+  const containerRef = useRef<HTMLDivElement>(null);
   const { playSound, soundEnabled } = useSettings();
   const {
     isPlaying,
@@ -29,6 +30,25 @@ const MusicPlayer = () => {
     }
   }, [isExpanded, fetchStations]);

+  // Close on click outside (for mobile)
+  useEffect(() => {
+    const handleClickOutside = (event: MouseEvent | TouchEvent) => {
+      if (containerRef.current && !containerRef.current.contains(event.target as Node)) {
+        setIsExpanded(false);
+      }
+    };
+
+    if (isExpanded) {
+      document.addEventListener('mousedown', handleClickOutside);
+      document.addEventListener('touchstart', handleClickOutside);
+    }
+
+    return () => {
+      document.removeEventListener('mousedown', handleClickOutside);
+      document.removeEventListener('touchstart', handleClickOutside);
+    };
+  }, [isExpanded]);
+
   const handleButtonClick = (action: () => void) => {
     if (soundEnabled) {
       playSound('click');
@@ -36,8 +56,13 @@
     action();
   };

+  const handleToggleExpand = () => {
+    setIsExpanded(!isExpanded);
+  };
+
   return (
     <div
+      ref={containerRef}
       className="fixed bottom-4 left-4 z-50"
       onMouseEnter={() => setIsExpanded(true)}
       onMouseLeave={() => setIsExpanded(false)}
@@ -156,6 +181,11 @@ const MusicPlayer = () => {
           animate={{ opacity: 1 }}
           exit={{ opacity: 0 }}
           className="p-3 bg-background/90 border border-primary box-glow cursor-pointer"
+          onClick={handleToggleExpand}
+          onTouchEnd={(e) => {
+            e.preventDefault();
+            handleToggleExpand();
+          }}
         >
           <Music2 className="w-5 h-5 text-primary text-glow" />
           {isPlaying && (

View File

@@ -0,0 +1,469 @@
import { useState, useCallback, useEffect, useRef } from 'react';
import { AudioUploader } from './AudioUploader';
import { OscilloscopeControls, LiveDisplaySettings } from './OscilloscopeControls';
import { OscilloscopeDisplay } from './OscilloscopeDisplay';
import { useAudioAnalyzer } from '@/hooks/useAudioAnalyzer';
import { useOscilloscopeRenderer } from '@/hooks/useOscilloscopeRenderer';
import { useVideoExporter } from '@/hooks/useVideoExporter';
import { useSettings } from '@/contexts/SettingsContext';
import { Pause, Play } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Slider } from '@/components/ui/slider';
// Format time in mm:ss format
const formatTime = (seconds: number): string => {
const mins = Math.floor(seconds / 60);
const secs = Math.floor(seconds % 60);
return `${mins}:${secs.toString().padStart(2, '0')}`;
};
export function Oscilloscope() {
const [mode, setMode] = useState<'combined' | 'separate' | 'all'>('combined');
const [liveSettings, setLiveSettings] = useState<LiveDisplaySettings>({
lineThickness: 2,
showGrid: true,
glowIntensity: 1,
displayMode: 'combined',
visualizationMode: 'waveform',
});
const [isMicActive, setIsMicActive] = useState(false);
const [isPlaying, setIsPlaying] = useState(false);
const [currentTime, setCurrentTime] = useState(0);
const [playbackSpeed, setPlaybackSpeed] = useState(1);
const [isLooping, setIsLooping] = useState(false);
const [seekPosition, setSeekPosition] = useState(0);
const [exportResolution, setExportResolution] = useState<'640x480' | '1280x720' | '1920x1080'>('1920x1080');
const [exportFps, setExportFps] = useState<24 | 30 | 60>(60);
const [exportQuality, setExportQuality] = useState<'low' | 'medium' | 'high'>('medium');
const [showMicCalibration, setShowMicCalibration] = useState(false);
const [micLevel, setMicLevel] = useState(0);
const [micGain, setMicGain] = useState(1);
const [micGainNode, setMicGainNode] = useState<GainNode | null>(null);
const isMicActiveRef = useRef(false);
// Audio playback refs
const audioRef = useRef<HTMLAudioElement | null>(null);
const audioUrlRef = useRef<string | null>(null);
const { audioData, isLoading, fileName, originalFile, loadAudioFile, reset: resetAudio } = useAudioAnalyzer();
const { isExporting, progress, exportedUrl, exportVideo, reset: resetExport } = useVideoExporter();
const { playSound } = useSettings();
// Update mic gain when it changes
useEffect(() => {
if (micGainNode) {
micGainNode.gain.value = micGain;
}
}, [micGain, micGainNode]);
// Real-time microphone input
const [micStream, setMicStream] = useState<MediaStream | null>(null);
const [micAnalyzer, setMicAnalyzer] = useState<AnalyserNode | null>(null);
// Create audio element when file is loaded
useEffect(() => {
if (originalFile) {
// Clean up previous audio URL
if (audioUrlRef.current) {
URL.revokeObjectURL(audioUrlRef.current);
}
// Create new audio element
const url = URL.createObjectURL(originalFile);
audioUrlRef.current = url;
const audio = new Audio(url);
audio.loop = isLooping;
audio.playbackRate = playbackSpeed;
audio.addEventListener('timeupdate', () => {
setCurrentTime(audio.currentTime);
if (audioData) {
const position = audio.currentTime / audioData.duration;
setSeekPosition(position);
}
});
audio.addEventListener('ended', () => {
if (!isLooping) {
setIsPlaying(false);
setCurrentTime(0);
setSeekPosition(0);
}
});
audioRef.current = audio;
return () => {
audio.pause();
audio.src = '';
if (audioUrlRef.current) {
URL.revokeObjectURL(audioUrlRef.current);
audioUrlRef.current = null;
}
};
}
}, [originalFile, audioData]);
// Update audio playback state
useEffect(() => {
if (audioRef.current) {
audioRef.current.loop = isLooping;
audioRef.current.playbackRate = playbackSpeed;
}
}, [isLooping, playbackSpeed]);
// Handle play/pause
useEffect(() => {
if (audioRef.current) {
if (isPlaying) {
audioRef.current.play().catch(console.error);
} else {
audioRef.current.pause();
}
}
}, [isPlaying]);
const handleFileSelect = useCallback((file: File) => {
loadAudioFile(file);
if (isMicActive) {
setIsMicActive(false);
}
}, [loadAudioFile, isMicActive]);
const toggleMic = useCallback(async () => {
playSound('click');
if (isMicActive) {
// Stop microphone
isMicActiveRef.current = false;
if (micStream) {
micStream.getTracks().forEach(track => track.stop());
setMicStream(null);
}
setMicAnalyzer(null);
setMicGainNode(null);
setIsMicActive(false);
setMicLevel(0);
setShowMicCalibration(false);
resetAudio();
} else {
// Start microphone
try {
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
setMicStream(stream);
const audioContext = new AudioContext();
const analyser = audioContext.createAnalyser();
analyser.fftSize = 2048;
analyser.smoothingTimeConstant = 0.8;
const gainNode = audioContext.createGain();
gainNode.gain.value = micGain;
setMicGainNode(gainNode);
const source = audioContext.createMediaStreamSource(stream);
source.connect(gainNode);
gainNode.connect(analyser);
// Start monitoring mic levels
isMicActiveRef.current = true;
const dataArray = new Uint8Array(analyser.frequencyBinCount);
const monitorLevels = () => {
if (isMicActiveRef.current && analyser) {
analyser.getByteFrequencyData(dataArray);
const average = dataArray.reduce((a, b) => a + b) / dataArray.length;
setMicLevel(average / 255); // Normalize to 0-1
requestAnimationFrame(monitorLevels);
}
};
monitorLevels();
setMicAnalyzer(analyser);
setIsMicActive(true);
setShowMicCalibration(true); // Show the calibration panel while the mic is active
resetAudio(); // Clear any loaded file
} catch (error) {
console.error('Error accessing microphone:', error);
alert('Could not access microphone. Please check permissions.');
}
}
}, [isMicActive, micStream, playSound, resetAudio, micGain]);
const handlePreview = useCallback(() => {
playSound('click');
setIsPlaying(!isPlaying);
}, [isPlaying, playSound]);
const handleSeek = useCallback((position: number) => {
if (audioData && audioRef.current && position >= 0) {
const newTime = position * audioData.duration;
audioRef.current.currentTime = newTime;
setCurrentTime(newTime);
setSeekPosition(position);
}
}, [audioData]);
const handlePlaybackSpeedChange = useCallback((speed: number) => {
setPlaybackSpeed(speed);
}, []);
const handleLoopingChange = useCallback((looping: boolean) => {
setIsLooping(looping);
}, []);
const handleExportResolutionChange = useCallback((resolution: string) => {
setExportResolution(resolution as '640x480' | '1280x720' | '1920x1080');
}, []);
const handleExportFpsChange = useCallback((fps: number) => {
setExportFps(fps as 24 | 30 | 60);
}, []);
const handleExportQualityChange = useCallback((quality: string) => {
setExportQuality(quality as 'low' | 'medium' | 'high');
}, []);
const handleGenerate = useCallback(async () => {
if (!audioData || !originalFile) return;
const [width, height] = exportResolution.split('x').map(Number);
await exportVideo(audioData, originalFile, {
width,
height,
fps: exportFps,
mode,
audioFile: originalFile,
quality: exportQuality,
});
}, [audioData, originalFile, exportVideo, mode, exportResolution, exportFps, exportQuality]);
const handleReset = useCallback(() => {
playSound('click');
setIsPlaying(false);
resetAudio();
resetExport();
}, [playSound, resetAudio, resetExport]);
const canPreview = (audioData !== null || isMicActive) && !isExporting;
const canGenerate = audioData !== null && originalFile !== null && !isExporting && !isLoading;
return (
<div className="space-y-6 max-w-7xl mx-auto">
{/* Header */}
<div className="text-center space-y-4">
<h2 className="font-minecraft text-2xl md:text-3xl text-primary text-glow-strong">
Audio Oscilloscope
</h2>
<p className="font-pixel text-foreground/80">
Visualize audio waveforms in real-time
</p>
</div>
{/* Main Content: Display + Controls Side by Side */}
<div className="grid grid-cols-1 xl:grid-cols-[1fr_320px] gap-6">
{/* Oscilloscope Display */}
<div className="flex flex-col items-center gap-4 order-2 xl:order-1">
<OscilloscopeDisplay
audioData={audioData}
micAnalyzer={micAnalyzer}
mode={liveSettings.displayMode}
isPlaying={isPlaying}
playbackSpeed={playbackSpeed}
isLooping={isLooping}
audioElementRef={audioRef}
onPlaybackEnd={() => {
setIsPlaying(false);
setCurrentTime(0);
setSeekPosition(0);
}}
onSeek={handleSeek}
liveSettings={liveSettings}
/>
{/* Audio Uploader - below controls on desktop */}
<div className="w-full mt-4">
<AudioUploader
onFileSelect={handleFileSelect}
isLoading={isLoading}
fileName={fileName}
/>
</div>
{/* Audio Playback Controls */}
{audioData && originalFile && (
<div className="w-full max-w-3xl space-y-2 px-4">
{/* Play/Pause and Time Display */}
<div className="flex items-center gap-4">
<Button
onClick={handlePreview}
variant="outline"
size="sm"
className="font-crt border-primary/50 hover:bg-primary/10"
disabled={isExporting}
>
{isPlaying ? <Pause size={16} /> : <Play size={16} />}
</Button>
<span className="font-mono-crt text-sm text-foreground/80 min-w-[80px]">
{formatTime(currentTime)} / {formatTime(audioData.duration)}
</span>
{/* Progress Bar */}
<div className="flex-1">
<Slider
value={[seekPosition * 100]}
onValueChange={(value) => handleSeek(value[0] / 100)}
max={100}
step={0.1}
className="cursor-pointer"
/>
</div>
</div>
</div>
)}
</div>
{/* Control Panel */}
<div className="order-1 xl:order-2">
<OscilloscopeControls
mode={mode}
onModeChange={setMode}
canGenerate={canGenerate}
isGenerating={isExporting}
progress={progress}
exportedUrl={exportedUrl}
onGenerate={handleGenerate}
onReset={handleReset}
isPlaying={isPlaying}
onPreview={handlePreview}
canPreview={canPreview}
playbackSpeed={playbackSpeed}
onPlaybackSpeedChange={handlePlaybackSpeedChange}
isLooping={isLooping}
onLoopingChange={handleLoopingChange}
exportResolution={exportResolution}
onExportResolutionChange={handleExportResolutionChange}
exportFps={exportFps}
onExportFpsChange={handleExportFpsChange}
exportQuality={exportQuality}
onExportQualityChange={handleExportQualityChange}
liveSettings={liveSettings}
onLiveSettingsChange={setLiveSettings}
isMicActive={isMicActive}
onToggleMic={toggleMic}
/>
</div>
</div>
{/* Microphone Calibration */}
{showMicCalibration && isMicActive && (
<div className="bg-card border border-border rounded-lg p-4 max-w-md mx-auto">
<h3 className="font-crt text-lg text-primary text-glow mb-4">Microphone Calibration</h3>
<div className="space-y-4">
{/* Level Indicator */}
<div>
<div className="flex justify-between text-sm font-mono-crt mb-2">
<span>Input Level</span>
<span>{Math.round(micLevel * 100)}%</span>
</div>
<div className="w-full bg-secondary rounded-full h-3">
<div
className="bg-primary h-3 rounded-full transition-all duration-100"
style={{ width: `${micLevel * 100}%` }}
/>
</div>
<div className="flex justify-between text-xs text-muted-foreground mt-1">
<span>Quiet</span>
<span className={micLevel > 0.7 ? 'text-red-400' : micLevel > 0.5 ? 'text-yellow-400' : 'text-green-400'}>
{micLevel > 0.7 ? 'Too Loud' : micLevel > 0.5 ? 'Good' : 'Too Quiet'}
</span>
<span>Loud</span>
</div>
</div>
{/* Gain Control */}
<div>
<div className="flex justify-between text-sm font-mono-crt mb-2">
<span>Gain</span>
<span>{micGain.toFixed(1)}x</span>
</div>
<input
type="range"
min="0.1"
max="3"
step="0.1"
value={micGain}
onChange={(e) => setMicGain(Number(e.target.value))}
className="w-full"
/>
</div>
<p className="text-xs text-muted-foreground font-mono-crt">
Speak into your microphone and adjust gain until the level shows "Good" (green).
The bar should peak around 50-70% when speaking normally.
</p>
</div>
</div>
)}
{/* Status Info */}
{(isLoading || isExporting) && (
<div className="text-center text-muted-foreground font-mono-crt text-sm">
{isLoading && "Loading audio..."}
{isExporting && `Exporting video... ${progress}%`}
</div>
)}
</div>
);
}

View File

@@ -0,0 +1,382 @@
import { Play, Download, RotateCcw, Mic, MicOff } from 'lucide-react';
import { Button } from '@/components/ui/button';
import { Label } from '@/components/ui/label';
import { Progress } from '@/components/ui/progress';
import { RadioGroup, RadioGroupItem } from '@/components/ui/radio-group';
import type { OscilloscopeMode } from '@/hooks/useOscilloscopeRenderer';
export type VisualizationMode = 'waveform' | 'spectrum' | 'both';
export interface LiveDisplaySettings {
lineThickness: number;
showGrid: boolean;
glowIntensity: number;
displayMode: OscilloscopeMode;
visualizationMode: VisualizationMode;
}
interface ControlPanelProps {
mode: OscilloscopeMode;
onModeChange: (mode: OscilloscopeMode) => void;
canGenerate: boolean;
isGenerating: boolean;
progress: number;
exportedUrl: string | null;
onGenerate: () => void;
onReset: () => void;
isPlaying: boolean;
onPreview: () => void;
canPreview: boolean;
playbackSpeed: number;
onPlaybackSpeedChange: (speed: number) => void;
isLooping: boolean;
onLoopingChange: (looping: boolean) => void;
exportResolution: string;
onExportResolutionChange: (resolution: string) => void;
exportFps: number;
onExportFpsChange: (fps: number) => void;
exportQuality: string;
onExportQualityChange: (quality: string) => void;
liveSettings: LiveDisplaySettings;
onLiveSettingsChange: (settings: LiveDisplaySettings) => void;
isMicActive: boolean;
onToggleMic: () => void;
}
export function OscilloscopeControls({
mode,
onModeChange,
canGenerate,
isGenerating,
progress,
exportedUrl,
onGenerate,
onReset,
isPlaying,
onPreview,
canPreview,
playbackSpeed,
onPlaybackSpeedChange,
isLooping,
onLoopingChange,
exportResolution,
onExportResolutionChange,
exportFps,
onExportFpsChange,
exportQuality,
onExportQualityChange,
liveSettings,
onLiveSettingsChange,
isMicActive,
onToggleMic,
}: ControlPanelProps) {
return (
<div className="flex flex-col gap-6 p-6 bg-card border border-border rounded-lg">
{/* Live Display Options */}
<div className="space-y-3">
<Label className="font-crt text-lg text-primary text-glow">LIVE DISPLAY</Label>
{/* Visualization Mode */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Visualization</span>
<div className="flex gap-1">
{(['waveform', 'spectrum', 'both'] as VisualizationMode[]).map((vizMode) => (
<button
key={vizMode}
onClick={() => onLiveSettingsChange({ ...liveSettings, visualizationMode: vizMode })}
className={`px-2 py-1 text-xs font-mono-crt border transition-all duration-300 capitalize ${
liveSettings.visualizationMode === vizMode
? 'border-primary text-primary bg-primary/10'
: 'border-primary/50 text-primary/70 hover:border-primary hover:text-primary'
}`}
>
{vizMode}
</button>
))}
</div>
</div>
{/* Display Mode */}
<RadioGroup
value={liveSettings.displayMode}
onValueChange={(value) => onLiveSettingsChange({ ...liveSettings, displayMode: value as OscilloscopeMode })}
className="space-y-2"
>
<div className="flex items-center space-x-3">
<RadioGroupItem value="combined" id="live-combined" className="border-primary" />
<Label htmlFor="live-combined" className="font-mono-crt text-sm cursor-pointer">
Combined (L+R waveform)
</Label>
</div>
<div className="flex items-center space-x-3">
<RadioGroupItem value="all" id="live-xy" className="border-primary" />
<Label htmlFor="live-xy" className="font-mono-crt text-sm cursor-pointer">
XY Mode (Lissajous)
</Label>
</div>
</RadioGroup>
{/* Line Thickness */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Line Thickness</span>
<div className="flex gap-1">
{[1, 2, 3, 4].map((thickness) => (
<button
key={thickness}
onClick={() => onLiveSettingsChange({ ...liveSettings, lineThickness: thickness })}
className={`px-2 py-1 text-xs font-mono-crt border transition-all duration-300 ${
liveSettings.lineThickness === thickness
? 'border-primary text-primary bg-primary/10'
: 'border-primary/50 text-primary/70 hover:border-primary hover:text-primary'
}`}
>
{thickness}px
</button>
))}
</div>
</div>
{/* Show Grid */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Show Grid</span>
<button
onClick={() => onLiveSettingsChange({ ...liveSettings, showGrid: !liveSettings.showGrid })}
className={`w-12 h-6 rounded-full border border-primary transition-all duration-300 ${
liveSettings.showGrid ? 'bg-primary' : 'bg-transparent'
}`}
>
<div
className={`w-4 h-4 rounded-full bg-background border border-primary transition-transform duration-300 ${
liveSettings.showGrid ? 'translate-x-6' : 'translate-x-1'
}`}
/>
</button>
</div>
{/* Glow Intensity */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Glow</span>
<div className="flex gap-1">
{[0, 1, 2, 3].map((glow) => (
<button
key={glow}
onClick={() => onLiveSettingsChange({ ...liveSettings, glowIntensity: glow })}
className={`px-2 py-1 text-xs font-mono-crt border transition-all duration-300 ${
liveSettings.glowIntensity === glow
? 'border-primary text-primary bg-primary/10'
: 'border-primary/50 text-primary/70 hover:border-primary hover:text-primary'
}`}
>
{glow === 0 ? 'Off' : glow}
</button>
))}
</div>
</div>
</div>
{/* Audio Input */}
<div className="space-y-3">
<Label className="font-crt text-lg text-primary text-glow">AUDIO INPUT</Label>
{/* Microphone Toggle */}
<Button
onClick={onToggleMic}
variant={isMicActive ? "default" : "outline"}
className={`w-full flex items-center justify-center gap-2 font-crt ${
isMicActive
? 'bg-primary text-primary-foreground'
: 'border-primary/50 hover:bg-primary/10'
}`}
>
{isMicActive ? <MicOff size={16} /> : <Mic size={16} />}
{isMicActive ? 'STOP MICROPHONE' : 'USE MICROPHONE'}
</Button>
{isMicActive && (
<p className="text-xs text-muted-foreground font-mono-crt text-center">
Real-time input active
</p>
)}
</div>
{/* Playback Controls */}
<div className="space-y-3">
<Label className="font-crt text-lg text-primary text-glow">PLAYBACK CONTROLS</Label>
{/* Playback Speed */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Speed: {playbackSpeed}x</span>
<div className="flex gap-1">
{[0.5, 1, 1.5, 2].map((speed) => (
<button
key={speed}
onClick={() => onPlaybackSpeedChange(speed)}
className={`px-2 py-1 text-xs font-mono-crt border transition-all duration-300 ${
playbackSpeed === speed
? 'border-primary text-primary bg-primary/10'
: 'border-primary/50 text-primary/70 hover:border-primary hover:text-primary'
}`}
>
{speed}x
</button>
))}
</div>
</div>
{/* Looping Toggle */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Looping</span>
<button
onClick={() => onLoopingChange(!isLooping)}
className={`w-12 h-6 rounded-full border border-primary transition-all duration-300 ${
isLooping ? 'bg-primary' : 'bg-transparent'
}`}
>
<div
className={`w-4 h-4 rounded-full bg-background border border-primary transition-transform duration-300 ${
isLooping ? 'translate-x-6' : 'translate-x-1'
}`}
/>
</button>
</div>
</div>
{/* Preview Button */}
<Button
onClick={onPreview}
disabled={!canPreview || isGenerating}
variant="outline"
className="w-full font-crt text-lg h-12 border-primary/50 hover:bg-primary/10 hover:border-primary"
>
<Play className="mr-2 h-5 w-5" />
{isPlaying ? 'PLAYING...' : 'PREVIEW'}
</Button>
{/* Export Options */}
<div className="space-y-3 pt-4 border-t border-border">
<Label className="font-crt text-lg text-primary text-glow">EXPORT OPTIONS</Label>
<p className="font-mono-crt text-xs text-muted-foreground">Settings for video export only</p>
{/* Export Display Mode */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Mode</span>
<select
value={mode}
onChange={(e) => onModeChange(e.target.value as OscilloscopeMode)}
className="bg-background border border-primary/50 text-primary font-mono-crt text-sm px-2 py-1"
>
<option value="combined">Combined</option>
<option value="separate">Separate</option>
<option value="all">All</option>
</select>
</div>
{/* Resolution */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Resolution</span>
<select
value={exportResolution}
onChange={(e) => onExportResolutionChange(e.target.value)}
className="bg-background border border-primary/50 text-primary font-mono-crt text-sm px-2 py-1"
>
<option value="640x480">640×480</option>
<option value="1280x720">1280×720</option>
<option value="1920x1080">1920×1080</option>
</select>
</div>
{/* Frame Rate */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Frame Rate</span>
<select
value={exportFps}
onChange={(e) => onExportFpsChange(Number(e.target.value))}
className="bg-background border border-primary/50 text-primary font-mono-crt text-sm px-2 py-1"
>
<option value={24}>24 FPS</option>
<option value={30}>30 FPS</option>
<option value={60}>60 FPS</option>
</select>
</div>
{/* Quality */}
<div className="flex items-center justify-between">
<span className="font-mono-crt text-sm text-foreground/90">Quality</span>
<select
value={exportQuality}
onChange={(e) => onExportQualityChange(e.target.value)}
className="bg-background border border-primary/50 text-primary font-mono-crt text-sm px-2 py-1"
>
<option value="low">Low</option>
<option value="medium">Medium</option>
<option value="high">High</option>
</select>
</div>
</div>
{/* Generate Button */}
<Button
onClick={onGenerate}
disabled={!canGenerate || isGenerating}
className="w-full font-crt text-lg h-14 bg-primary hover:bg-primary/80 text-primary-foreground"
>
{isGenerating ? (
<>
<div className="w-5 h-5 border-2 border-primary-foreground border-t-transparent rounded-full animate-spin mr-2" />
GENERATING...
</>
) : (
'GENERATE VIDEO'
)}
</Button>
{/* Progress Bar */}
{isGenerating && (
<div className="space-y-2">
<Progress value={progress} className="h-3 bg-secondary" />
<p className="text-center font-mono-crt text-sm text-muted-foreground">
{progress}% complete
</p>
<p className="text-center font-mono-crt text-xs text-muted-foreground/70">
Keep this tab in foreground
</p>
</div>
)}
{/* Download Button */}
{exportedUrl && (
<div className="space-y-3">
<a
href={exportedUrl}
download="oscilloscope-video.webm"
className="block"
>
<Button
variant="outline"
className="w-full font-crt text-lg h-12 border-accent hover:bg-accent/10 text-accent"
>
<Download className="mr-2 h-5 w-5" />
DOWNLOAD VIDEO
</Button>
</a>
<Button
onClick={onReset}
variant="ghost"
className="w-full font-mono-crt text-muted-foreground hover:text-primary"
>
<RotateCcw className="mr-2 h-4 w-4" />
Reset
</Button>
</div>
)}
{/* Info */}
<div className="text-xs text-muted-foreground font-mono-crt space-y-1 pt-4 border-t border-border">
<p>Output: WebM (VP9)</p>
<p>Quality affects video bitrate</p>
<p>Supports files up to 6+ hours</p>
</div>
</div>
);
}

View File

@@ -0,0 +1,505 @@
import { useEffect, useRef, useCallback } from 'react';
import type { AudioData } from '@/hooks/useAudioAnalyzer';
import type { OscilloscopeMode } from '@/hooks/useOscilloscopeRenderer';
import { useAudioAnalyzer as useSharedAudioAnalyzer } from '@/contexts/AudioAnalyzerContext';
import type { LiveDisplaySettings, VisualizationMode } from './OscilloscopeControls';
interface OscilloscopeDisplayProps {
audioData: AudioData | null;
micAnalyzer: AnalyserNode | null;
mode: OscilloscopeMode;
isPlaying: boolean;
playbackSpeed: number;
isLooping: boolean;
audioElementRef?: React.RefObject<HTMLAudioElement | null>;
onPlaybackEnd?: () => void;
onSeek?: (position: number) => void;
liveSettings?: LiveDisplaySettings;
}
const WIDTH = 800;
const HEIGHT = 600;
const FPS = 60;
// Get computed CSS color from theme
const getThemeColor = (cssVar: string, fallback: string): string => {
if (typeof window === 'undefined') return fallback;
const root = document.documentElement;
const value = getComputedStyle(root).getPropertyValue(cssVar).trim();
if (value) {
return `hsl(${value})`;
}
return fallback;
};
export function OscilloscopeDisplay({
audioData,
micAnalyzer,
mode,
isPlaying,
playbackSpeed,
isLooping,
audioElementRef,
onPlaybackEnd,
onSeek,
liveSettings
}: OscilloscopeDisplayProps) {
const canvasRef = useRef<HTMLCanvasElement>(null);
const animationRef = useRef<number | null>(null);
const { analyzerNode: sharedAnalyzer } = useSharedAudioAnalyzer();
// Use shared analyzer for live audio (music player, sound effects)
const liveAnalyzer = sharedAnalyzer || micAnalyzer;
// Get settings with defaults
const lineThickness = liveSettings?.lineThickness ?? 2;
const showGrid = liveSettings?.showGrid ?? true;
const glowIntensity = liveSettings?.glowIntensity ?? 1;
const liveDisplayMode = liveSettings?.displayMode ?? 'combined';
const visualizationMode = liveSettings?.visualizationMode ?? 'waveform';
const drawGraticule = useCallback((ctx: CanvasRenderingContext2D) => {
if (!showGrid) return;
const primaryColor = getThemeColor('--primary', '#00ff00');
ctx.strokeStyle = primaryColor;
ctx.globalAlpha = 0.3;
ctx.lineWidth = 1;
// Horizontal center line (X axis)
ctx.beginPath();
ctx.moveTo(0, HEIGHT / 2);
ctx.lineTo(WIDTH, HEIGHT / 2);
ctx.stroke();
// Vertical center line (Y axis)
ctx.beginPath();
ctx.moveTo(WIDTH / 2, 0);
ctx.lineTo(WIDTH / 2, HEIGHT);
ctx.stroke();
ctx.globalAlpha = 1;
}, [showGrid]);
// Draw spectrum bars
const drawSpectrum = useCallback((ctx: CanvasRenderingContext2D, frequencyData: Uint8Array, yOffset: number = 0, heightRatio: number = 1) => {
const primaryColor = getThemeColor('--primary', '#00ff00');
const accentColor = getThemeColor('--accent', '#00ccff');
const barCount = 64;
const barWidth = (WIDTH / barCount) - 2;
const maxBarHeight = (HEIGHT * heightRatio) * 0.8;
// Sample frequency data for the bar count
const step = Math.floor(frequencyData.length / barCount);
for (let i = 0; i < barCount; i++) {
const dataIndex = i * step;
const value = frequencyData[dataIndex] / 255;
const barHeight = value * maxBarHeight;
const x = i * (barWidth + 2);
const y = yOffset + (HEIGHT * heightRatio) - barHeight;
// Create gradient for each bar
const gradient = ctx.createLinearGradient(x, y + barHeight, x, y);
gradient.addColorStop(0, primaryColor);
gradient.addColorStop(1, accentColor);
ctx.fillStyle = gradient;
ctx.fillRect(x, y, barWidth, barHeight);
}
}, []);
const drawFrame = useCallback(() => {
if (!canvasRef.current) return;
// Always allow drawing if we have live analyzer, even without audioData
const hasLiveSource = liveAnalyzer || micAnalyzer;
if (!audioData && !hasLiveSource) return;
const canvas = canvasRef.current;
const ctx = canvas.getContext('2d');
if (!ctx) return;
const primaryColor = getThemeColor('--primary', '#00ff00');
const backgroundColor = getThemeColor('--background', '#000000');
let samplesPerFrame: number = 0;
let endSample: number = 0;
let samplesToAdvance: number = 0;
// Priority: micAnalyzer > liveAnalyzer (shared) > audioData (file)
const activeAnalyzer = micAnalyzer || liveAnalyzer;
if (activeAnalyzer && !audioData) {
// Real-time audio data (mic or music player)
const bufferLength = activeAnalyzer.frequencyBinCount;
const timeDomainData = new Uint8Array(bufferLength);
const frequencyData = new Uint8Array(bufferLength);
activeAnalyzer.getByteTimeDomainData(timeDomainData);
activeAnalyzer.getByteFrequencyData(frequencyData);
// Clear to background color
ctx.fillStyle = backgroundColor;
ctx.fillRect(0, 0, WIDTH, HEIGHT);
// Draw graticule first (only for waveform modes)
if (visualizationMode !== 'spectrum') {
drawGraticule(ctx);
}
// Convert to Float32Array-like for consistency
const liveData = new Float32Array(timeDomainData.length);
for (let i = 0; i < timeDomainData.length; i++) {
liveData[i] = (timeDomainData[i] - 128) / 128; // Normalize to -1 to 1
}
// Apply glow effect
if (glowIntensity > 0) {
ctx.shadowColor = primaryColor;
ctx.shadowBlur = glowIntensity * 8;
} else {
ctx.shadowBlur = 0;
}
ctx.strokeStyle = primaryColor;
ctx.lineWidth = lineThickness;
// Draw based on visualization mode
if (visualizationMode === 'spectrum') {
// Spectrum bars only
ctx.shadowBlur = 0;
drawSpectrum(ctx, frequencyData, 0, 1);
} else if (visualizationMode === 'both') {
// Waveform on top half, spectrum on bottom half
// Draw waveform
if (liveDisplayMode === 'all') {
// XY mode in top half
ctx.beginPath();
const centerX = WIDTH / 2;
const centerY = HEIGHT / 4;
const scale = Math.min(WIDTH, HEIGHT / 2) * 0.35;
for (let i = 0; i < liveData.length - 1; i += 2) {
const x = centerX + liveData[i] * scale;
const y = centerY - liveData[i + 1] * scale;
if (i === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
} else {
// Combined waveform in top half
ctx.beginPath();
const sliceWidth = WIDTH / liveData.length;
let x = 0;
for (let i = 0; i < liveData.length; i++) {
const v = liveData[i];
const y = (v * HEIGHT * 0.4) / 2 + HEIGHT / 4;
if (i === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
x += sliceWidth;
}
ctx.stroke();
}
// Spectrum in bottom half
ctx.shadowBlur = 0;
drawSpectrum(ctx, frequencyData, HEIGHT / 2, 0.5);
// Divider line
ctx.strokeStyle = 'rgba(255,255,255,0.1)';
ctx.beginPath();
ctx.moveTo(0, HEIGHT / 2);
ctx.lineTo(WIDTH, HEIGHT / 2);
ctx.stroke();
} else {
// Waveform only (default)
if (liveDisplayMode === 'all') {
// XY / Lissajous mode - treat odd/even samples as L/R
ctx.beginPath();
const centerX = WIDTH / 2;
const centerY = HEIGHT / 2;
const scale = Math.min(WIDTH, HEIGHT) * 0.4;
for (let i = 0; i < liveData.length - 1; i += 2) {
const x = centerX + liveData[i] * scale;
const y = centerY - liveData[i + 1] * scale;
if (i === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
} else {
// Combined waveform mode
ctx.beginPath();
const sliceWidth = WIDTH / liveData.length;
let x = 0;
for (let i = 0; i < liveData.length; i++) {
const v = liveData[i];
const y = (v * HEIGHT) / 2 + HEIGHT / 2;
if (i === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
x += sliceWidth;
}
ctx.stroke();
}
}
ctx.shadowBlur = 0;
// Request next frame for real-time
animationRef.current = requestAnimationFrame(drawFrame);
return;
}
// File playback mode - need audioData
if (!audioData) return;
// File playback mode - sync with audio element if available
const baseSamplesPerFrame = Math.floor(audioData.sampleRate / FPS);
samplesPerFrame = Math.floor(baseSamplesPerFrame * playbackSpeed);
samplesToAdvance = samplesPerFrame;
// Get current position from audio element (real-time sync at 60fps)
let startSample: number;
if (audioElementRef?.current && !audioElementRef.current.paused) {
const currentTime = audioElementRef.current.currentTime;
startSample = Math.floor((currentTime / audioData.duration) * audioData.leftChannel.length);
} else {
// Fallback: just show first frame when paused
startSample = 0;
}
endSample = Math.min(startSample + samplesPerFrame, audioData.leftChannel.length);
// Clear to background color
ctx.fillStyle = backgroundColor;
ctx.fillRect(0, 0, WIDTH, HEIGHT);
// Draw graticule first
drawGraticule(ctx);
// Apply glow effect
if (glowIntensity > 0) {
ctx.shadowColor = primaryColor;
ctx.shadowBlur = glowIntensity * 8;
} else {
ctx.shadowBlur = 0;
}
ctx.lineWidth = lineThickness;
ctx.lineCap = 'round';
const leftColor = primaryColor;
const rightColor = getThemeColor('--accent', '#00ccff');
const xyColor = getThemeColor('--secondary', '#ff8800');
const dividerColor = 'rgba(255,255,255,0.1)';
if (mode === 'combined') {
// Combined: both channels merged
ctx.strokeStyle = leftColor;
ctx.beginPath();
const samplesPerPixel = samplesPerFrame / WIDTH;
const centerY = HEIGHT / 2;
for (let x = 0; x < WIDTH; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.leftChannel.length) break;
const sample = (audioData.leftChannel[sampleIndex] + audioData.rightChannel[sampleIndex]) / 2;
const y = centerY - sample * (HEIGHT * 0.4);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
} else if (mode === 'separate') {
// Separate: Left on top, Right on bottom
const halfHeight = HEIGHT / 2;
const samplesPerPixel = samplesPerFrame / WIDTH;
// Left channel (top)
ctx.strokeStyle = leftColor;
ctx.beginPath();
const leftCenterY = halfHeight / 2;
for (let x = 0; x < WIDTH; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.leftChannel.length) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (halfHeight * 0.35);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Right channel (bottom)
ctx.strokeStyle = rightColor;
ctx.beginPath();
const rightCenterY = halfHeight + halfHeight / 2;
for (let x = 0; x < WIDTH; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.rightChannel.length) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (halfHeight * 0.35);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Divider
ctx.strokeStyle = dividerColor;
ctx.beginPath();
ctx.moveTo(0, halfHeight);
ctx.lineTo(WIDTH, halfHeight);
ctx.stroke();
} else if (mode === 'all') {
// All: L/R on top row, XY on bottom
const topHeight = HEIGHT / 2;
const bottomHeight = HEIGHT / 2;
const halfWidth = WIDTH / 2;
const samplesPerPixel = samplesPerFrame / halfWidth;
// Left channel (top-left)
ctx.strokeStyle = leftColor;
ctx.beginPath();
const leftCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.leftChannel.length) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (topHeight * 0.35);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Right channel (top-right)
ctx.strokeStyle = rightColor;
ctx.beginPath();
const rightCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.rightChannel.length) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (topHeight * 0.35);
if (x === 0) ctx.moveTo(halfWidth + x, y);
else ctx.lineTo(halfWidth + x, y);
}
ctx.stroke();
// XY mode (bottom half)
ctx.strokeStyle = xyColor;
ctx.beginPath();
const xyCenterX = WIDTH / 2;
const xyCenterY = topHeight + bottomHeight / 2;
const xyScale = Math.min(halfWidth, bottomHeight) * 0.35;
for (let i = startSample; i < endSample; i++) {
const x = xyCenterX + audioData.leftChannel[i] * xyScale;
const y = xyCenterY - audioData.rightChannel[i] * xyScale;
if (i === startSample) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Dividers
ctx.strokeStyle = dividerColor;
ctx.beginPath();
ctx.moveTo(0, topHeight);
ctx.lineTo(WIDTH, topHeight);
ctx.stroke();
ctx.beginPath();
ctx.moveTo(halfWidth, 0);
ctx.lineTo(halfWidth, topHeight);
ctx.stroke();
}
ctx.shadowBlur = 0;
// Check if audio ended (when syncing to audio element)
if (audioElementRef?.current) {
if (audioElementRef.current.ended && !isLooping) {
onPlaybackEnd?.();
return;
}
}
animationRef.current = requestAnimationFrame(drawFrame);
}, [audioData, micAnalyzer, liveAnalyzer, mode, drawGraticule, drawSpectrum, onPlaybackEnd, isPlaying, playbackSpeed, isLooping, lineThickness, glowIntensity, liveDisplayMode, visualizationMode, audioElementRef]);
// Initialize canvas
useEffect(() => {
if (!canvasRef.current) return;
const ctx = canvasRef.current.getContext('2d');
if (ctx) {
ctx.fillStyle = '#000000';
ctx.fillRect(0, 0, WIDTH, HEIGHT);
drawGraticule(ctx);
}
}, [drawGraticule]);
// Handle playback - start animation for file playback or live audio
useEffect(() => {
const hasLiveSource = liveAnalyzer || micAnalyzer;
if (isPlaying && audioData) {
// File playback
animationRef.current = requestAnimationFrame(drawFrame);
} else if (hasLiveSource && !audioData) {
// Live audio visualization (music player, sound effects)
animationRef.current = requestAnimationFrame(drawFrame);
} else {
if (animationRef.current) {
cancelAnimationFrame(animationRef.current);
}
}
return () => {
if (animationRef.current) {
cancelAnimationFrame(animationRef.current);
}
};
}, [isPlaying, audioData, liveAnalyzer, micAnalyzer, drawFrame]);
const getModeLabel = () => {
switch (mode) {
case 'combined': return 'L+R';
case 'separate': return 'L / R';
case 'all': return 'ALL';
default: return '';
}
};
return (
<div className="crt-bezel">
<div className="screen-curve relative">
<canvas
ref={canvasRef}
width={WIDTH}
height={HEIGHT}
className="w-full h-auto cursor-pointer"
onClick={(e) => {
if (!audioData) return;
const rect = canvasRef.current?.getBoundingClientRect();
if (!rect) return;
const x = e.clientX - rect.left;
const clickPosition = x / rect.width;
onSeek?.(Math.max(0, Math.min(1, clickPosition)));
}}
/>
{/* Mode indicator */}
<div className="absolute top-4 left-4 font-crt text-primary/60 text-sm">
{getModeLabel()}
</div>
{/* Idle state - only show if no live audio and no file */}
{!audioData && !liveAnalyzer && !micAnalyzer && (
<div className="absolute inset-0 flex items-center justify-center">
<p className="font-crt text-2xl text-primary/40 text-glow animate-pulse">
NO SIGNAL
</p>
</div>
)}
</div>
</div>
);
}
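
The canvas click handler above only reports a normalized 0–1 position via onSeek; mapping that back onto the audio element is left to the parent. A minimal sketch of that wiring, assuming the parent owns the same audioElementRef it passes down (useSeekHandler is a hypothetical name, not part of this commit):

import { useCallback, useRef } from 'react';

// Hypothetical parent-side wiring for OscilloscopeDisplay's onSeek prop.
export function useSeekHandler() {
  const audioElementRef = useRef<HTMLAudioElement | null>(null);
  const handleSeek = useCallback((position: number) => {
    const audio = audioElementRef.current;
    // duration is NaN before metadata loads and Infinity for live streams
    if (!audio || !Number.isFinite(audio.duration)) return;
    audio.currentTime = position * audio.duration; // position is already clamped to 0..1
  }, []);
  return { audioElementRef, handleSeek };
}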

View File

@ -0,0 +1,15 @@
import { ReactNode } from 'react';
interface OscilloscopeScreenProps {
children: ReactNode;
}
export function OscilloscopeScreen({ children }: OscilloscopeScreenProps) {
return (
<div className="crt-bezel">
<div className="screen-curve relative bg-black border border-primary/20">
{children}
</div>
</div>
);
}
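
OscilloscopeScreen is purely presentational: it supplies the CRT bezel and curved-screen chrome and renders whatever children it receives. A minimal hypothetical consumer, just to illustrate the intended composition (the import path is assumed):

import { OscilloscopeScreen } from './OscilloscopeScreen';

// Hypothetical consumer: any canvas-based visual can be framed by the bezel.
export function MiniScope() {
  return (
    <OscilloscopeScreen>
      <canvas width={320} height={180} className="w-full h-auto" />
    </OscilloscopeScreen>
  );
}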

View File

@ -30,7 +30,10 @@ const commands: Record<string, string> = {
'/a': '/achievements',
'/credits': '/credits',
'/cred': '/credits',
- };
+ '/oscilloscope': '/oscilloscope',
+ '/oscope': '/oscilloscope',
+ '/o': '/oscilloscope',
+ };
const helpText = `Available commands:
/home - Navigate to Home
@ -47,6 +50,7 @@ const helpText = `Available commands:
/music, /m - Navigate to Music Player
/ai, /chat - Navigate to AI Chat
/achievements /a - View achievements
+ /oscilloscope /o - Audio oscilloscope
/credits /cred - View credits
/help, /h - Show this help message
/clear, /c - Clear terminal output`;
@ -61,6 +65,8 @@ const TerminalCommand = () => {
const [isOpen, setIsOpen] = useState(false);
const [input, setInput] = useState('');
const [output, setOutput] = useState<string[]>(['Type /help for available commands']);
+ const [commandHistory, setCommandHistory] = useState<string[]>([]);
+ const [historyIndex, setHistoryIndex] = useState(-1);
const inputRef = useRef<HTMLInputElement>(null);
const outputRef = useRef<HTMLDivElement>(null);
const navigate = useNavigate();
@ -112,19 +118,28 @@ const TerminalCommand = () => {
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
- const trimmedInput = input.trim().toLowerCase();
+ const trimmedInput = input.trim();
if (!trimmedInput) return;
playSound('click');
setOutput(prev => [...prev, `> ${input}`]);
+ // Add to command history (avoid duplicates of the last command)
+ setCommandHistory(prev => {
+ const newHistory = prev.filter(cmd => cmd !== trimmedInput);
+ return [...newHistory, trimmedInput];
+ });
+ setHistoryIndex(-1);
// Unlock terminal user achievement
unlockAchievement('terminal_user');
- if (trimmedInput === '/help' || trimmedInput === '/h') {
+ const lowerInput = trimmedInput.toLowerCase();
+ if (lowerInput === '/help' || lowerInput === '/h') {
setOutput(prev => [...prev, helpText, '---HINT---' + helpHint]);
- } else if (trimmedInput === '/hint') {
+ } else if (lowerInput === '/hint') {
unlockAchievement('hint_seeker');
setOutput(prev => [...prev,
'Hidden feature detected in system...',
@ -132,17 +147,17 @@ const TerminalCommand = () => {
'Think NES, 1986, Contra... 30 lives anyone?',
'The sequence uses arrow keys and two letters.'
]);
- } else if (trimmedInput === '/clear' || trimmedInput === '/c') {
+ } else if (lowerInput === '/clear' || lowerInput === '/c') {
setOutput(['Terminal cleared. Type /help for commands.']);
- } else if (commands[trimmedInput]) {
+ } else if (commands[lowerInput]) {
- setOutput(prev => [...prev, `Navigating to ${trimmedInput.slice(1)}...`]);
+ setOutput(prev => [...prev, `Navigating to ${lowerInput.slice(1)}...`]);
playSound('beep');
setTimeout(() => {
- navigate(commands[trimmedInput]);
+ navigate(commands[lowerInput]);
setIsOpen(false);
}, 300);
} else {
- setOutput(prev => [...prev, `Command not found: ${trimmedInput}`, 'Type /help for available commands']);
+ setOutput(prev => [...prev, `Command not found: ${lowerInput}`, 'Type /help for available commands']);
}
setInput('');
@ -209,7 +224,44 @@ const TerminalCommand = () => {
ref={inputRef}
type="text"
value={input}
- onChange={(e) => setInput(e.target.value)}
+ onChange={(e) => {
+ setInput(e.target.value);
+ setHistoryIndex(-1); // Reset history navigation when typing
+ }}
+ onKeyDown={(e) => {
+ if (e.key === 'ArrowUp') {
+ e.preventDefault();
+ if (commandHistory.length > 0) {
+ const newIndex = historyIndex === -1 ? commandHistory.length - 1 : Math.max(0, historyIndex - 1);
+ setHistoryIndex(newIndex);
+ setInput(commandHistory[newIndex]);
+ }
+ } else if (e.key === 'ArrowDown') {
+ e.preventDefault();
+ if (historyIndex >= 0) {
+ const newIndex = historyIndex + 1;
+ if (newIndex >= commandHistory.length) {
+ setHistoryIndex(-1);
+ setInput('');
+ } else {
+ setHistoryIndex(newIndex);
+ setInput(commandHistory[newIndex]);
+ }
+ }
+ } else if (e.key === 'Tab') {
+ e.preventDefault();
+ const currentInput = input.trim().toLowerCase();
+ if (currentInput) {
+ // Find commands that start with current input
+ const matches = Object.keys(commands).filter(cmd => cmd.startsWith(currentInput));
+ if (matches.length === 1) {
+ setInput(matches[0]);
+ } else if (matches.length > 1) {
+ setOutput(prev => [...prev, `Possible completions: ${matches.join(', ')}`]);
+ }
+ }
+ }
+ }}
className="flex-1 bg-transparent border-none outline-none font-mono text-primary placeholder-primary/40"
placeholder="Enter command..."
autoComplete="off"

View File

@ -0,0 +1,137 @@
import { createContext, useContext, useRef, useCallback, ReactNode, useEffect, useState } from 'react';
interface AudioAnalyzerContextType {
analyzerNode: AnalyserNode | null;
audioContext: AudioContext | null;
connectAudioElement: (element: HTMLAudioElement) => void;
disconnectAudioElement: (element: HTMLAudioElement) => void;
connectOscillator: (oscillator: OscillatorNode, gainNode: GainNode) => void;
isReady: boolean;
}
const AudioAnalyzerContext = createContext<AudioAnalyzerContextType | undefined>(undefined);
export const AudioAnalyzerProvider = ({ children }: { children: ReactNode }) => {
const audioContextRef = useRef<AudioContext | null>(null);
const analyzerRef = useRef<AnalyserNode | null>(null);
const sourceMapRef = useRef<Map<HTMLAudioElement, MediaElementAudioSourceNode>>(new Map());
const [isReady, setIsReady] = useState(false);
const [, forceUpdate] = useState(0);
// Initialize audio context - call immediately but handle suspended state
const initAudioContext = useCallback(() => {
if (audioContextRef.current) return audioContextRef.current;
try {
const ctx = new (window.AudioContext || (window as any).webkitAudioContext)();
audioContextRef.current = ctx;
// Create analyzer node
const analyzer = ctx.createAnalyser();
analyzer.fftSize = 512;
analyzer.smoothingTimeConstant = 0.8;
analyzer.connect(ctx.destination);
analyzerRef.current = analyzer;
setIsReady(true);
forceUpdate(n => n + 1); // Force re-render to update context value
return ctx;
} catch (e) {
console.error('Failed to create AudioContext:', e);
return null;
}
}, []);
// Initialize immediately on mount
useEffect(() => {
initAudioContext();
}, [initAudioContext]);
// Connect an audio element to the analyzer
const connectAudioElement = useCallback((element: HTMLAudioElement) => {
const ctx = initAudioContext();
if (!ctx || !analyzerRef.current) return;
// Already connected?
if (sourceMapRef.current.has(element)) return;
try {
// Resume context if suspended
if (ctx.state === 'suspended') {
ctx.resume();
}
const source = ctx.createMediaElementSource(element);
source.connect(analyzerRef.current);
sourceMapRef.current.set(element, source);
console.log('Connected audio element to analyzer');
} catch (e) {
// Element might already be connected to a different context
console.log('Could not connect audio element:', e);
}
}, [initAudioContext]);
// Disconnect an audio element
const disconnectAudioElement = useCallback((element: HTMLAudioElement) => {
const source = sourceMapRef.current.get(element);
if (source) {
try {
source.disconnect();
} catch (e) {
// Ignore
}
sourceMapRef.current.delete(element);
}
}, []);
// Connect oscillator (for sound effects) to analyzer
const connectOscillator = useCallback((oscillator: OscillatorNode, gainNode: GainNode) => {
if (!analyzerRef.current) return;
// Route through analyzer instead of direct to destination
gainNode.disconnect();
gainNode.connect(analyzerRef.current);
}, []);
// Cleanup on unmount
useEffect(() => {
return () => {
sourceMapRef.current.forEach((source) => {
try {
source.disconnect();
} catch (e) {
// Ignore
}
});
sourceMapRef.current.clear();
if (audioContextRef.current) {
audioContextRef.current.close();
audioContextRef.current = null;
}
};
}, []);
return (
<AudioAnalyzerContext.Provider
value={{
analyzerNode: analyzerRef.current,
audioContext: audioContextRef.current,
connectAudioElement,
disconnectAudioElement,
connectOscillator,
isReady,
}}
>
{children}
</AudioAnalyzerContext.Provider>
);
};
export const useAudioAnalyzer = () => {
const context = useContext(AudioAnalyzerContext);
if (context === undefined) {
throw new Error('useAudioAnalyzer must be used within an AudioAnalyzerProvider');
}
return context;
};
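
Anything mounted under AudioAnalyzerProvider can read live data from the shared AnalyserNode. The sketch below shows one way a consumer might sample the time-domain buffer once per animation frame; the hook name useLiveWaveform, its callback signature, and the relative import path are illustrative, not part of this commit:

import { useEffect, useRef } from 'react';
import { useAudioAnalyzer } from './AudioAnalyzerContext';

// Hypothetical consumer: copies the analyzer's time-domain data every frame.
export function useLiveWaveform(onFrame: (data: Float32Array) => void) {
  const { analyzerNode, isReady } = useAudioAnalyzer();
  const rafRef = useRef<number>();
  useEffect(() => {
    if (!isReady || !analyzerNode) return;
    const buffer = new Float32Array(analyzerNode.fftSize);
    const tick = () => {
      analyzerNode.getFloatTimeDomainData(buffer); // samples roughly in -1..1
      onFrame(buffer);
      rafRef.current = requestAnimationFrame(tick);
    };
    rafRef.current = requestAnimationFrame(tick);
    return () => {
      if (rafRef.current) cancelAnimationFrame(rafRef.current);
    };
  }, [analyzerNode, isReady, onFrame]);
}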

View File

@ -1,4 +1,5 @@
import { createContext, useContext, useState, useRef, useCallback, useEffect, ReactNode } from 'react';
+ import { useAudioAnalyzer } from './AudioAnalyzerContext';
export interface Station {
stationuuid: string;
@ -39,6 +40,7 @@ export const MusicProvider = ({ children }: { children: ReactNode }) => {
const [hasFetched, setHasFetched] = useState(false);
const [failedStations, setFailedStations] = useState<Set<string>>(new Set());
const audioRef = useRef<HTMLAudioElement | null>(null);
+ const { connectAudioElement, disconnectAudioElement } = useAudioAnalyzer();
// Update volume on audio element when volume state changes
useEffect(() => {
@ -69,6 +71,7 @@ export const MusicProvider = ({ children }: { children: ReactNode }) => {
const stopCurrentAudio = useCallback(() => {
if (audioRef.current) {
+ disconnectAudioElement(audioRef.current);
audioRef.current.pause();
audioRef.current.src = '';
audioRef.current.onplay = null;
@ -79,16 +82,20 @@ export const MusicProvider = ({ children }: { children: ReactNode }) => {
audioRef.current = null;
}
setIsBuffering(false);
- }, []);
+ }, [disconnectAudioElement]);
const playStation = useCallback((station: Station, index: number) => {
stopCurrentAudio();
setIsBuffering(true);
const audio = new Audio(station.url);
+ audio.crossOrigin = 'anonymous';
audio.volume = volume / 100;
audioRef.current = audio;
+ // Connect to analyzer for visualization
+ connectAudioElement(audio);
audio.onerror = () => {
console.error('Failed to play station:', station.name);
setFailedStations(prev => new Set(prev).add(station.stationuuid));
@ -127,7 +134,7 @@ export const MusicProvider = ({ children }: { children: ReactNode }) => {
setSelectedStation(station);
setCurrentIndex(index);
- }, [volume, stopCurrentAudio]);
+ }, [volume, stopCurrentAudio, connectAudioElement]);
const togglePlay = useCallback(() => {
if (!audioRef.current || !selectedStation) {
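
One detail worth calling out in the hunk above: audio.crossOrigin = 'anonymous' matters because Web Audio silences the output of a MediaElementAudioSourceNode whose media was fetched cross-origin without CORS, so without it the analyzer would read only zeros for remote stations. A standalone sketch of the same wiring, with createAnalyzedAudio as a hypothetical helper (not part of this commit):

// Hypothetical helper: build an <audio> element that is safe to route through an AnalyserNode.
// crossOrigin is set before src so the request is made in CORS mode; non-CORS cross-origin
// media would be silenced by createMediaElementSource.
export function createAnalyzedAudio(url: string, ctx: AudioContext, analyzer: AnalyserNode): HTMLAudioElement {
  const audio = new Audio();
  audio.crossOrigin = 'anonymous';
  audio.src = url;
  const source = ctx.createMediaElementSource(audio);
  source.connect(analyzer); // assumes the analyser is already connected to ctx.destination
  return audio;
}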

View File

@ -1,4 +1,5 @@
import { createContext, useContext, useState, useEffect, useRef, useCallback, ReactNode } from 'react';
+ import { useAudioAnalyzer } from './AudioAnalyzerContext';
type SoundType = 'click' | 'beep' | 'hover' | 'boot' | 'success' | 'error';
@ -66,27 +67,29 @@ export const SettingsProvider = ({ children }: { children: ReactNode }) => {
const [totalHashes, setTotalHashes] = useState(0);
const [acceptedHashes, setAcceptedHashes] = useState(0);
- // Single AudioContext instance
+ // Use the shared audio analyzer context
- const audioContextRef = useRef<AudioContext | null>(null);
+ const { audioContext: sharedAudioContext, analyzerNode } = useAudioAnalyzer();
const soundEnabledRef = useRef(soundEnabled);
useEffect(() => {
soundEnabledRef.current = soundEnabled;
}, [soundEnabled]);
+ // Local audio context for sound effects (fallback if shared not available)
+ const audioContextRef = useRef<AudioContext | null>(null);
// Detect audio blocked and show overlay
useEffect(() => {
if (!soundEnabled) return;
// Check if we need to show the audio overlay
const checkAudioState = () => {
- if (audioContextRef.current) {
+ const ctx = sharedAudioContext || audioContextRef.current;
- if (audioContextRef.current.state === 'suspended' && !userInteracted) {
+ if (ctx) {
+ if (ctx.state === 'suspended' && !userInteracted) {
setAudioBlocked(true);
setShowAudioOverlay(true);
}
} else {
// Try to create AudioContext to check if it's blocked
try {
const testContext = new (window.AudioContext || (window as any).webkitAudioContext)();
if (testContext.state === 'suspended') {
@ -100,13 +103,26 @@ export const SettingsProvider = ({ children }: { children: ReactNode }) => {
}
};
// Small delay to let page load
const timeout = setTimeout(checkAudioState, 500);
return () => clearTimeout(timeout);
- }, [soundEnabled, userInteracted]);
+ }, [soundEnabled, userInteracted, sharedAudioContext]);
- // Get or create AudioContext
+ // Get or create AudioContext (prefer shared context)
const getAudioContext = useCallback(() => {
+ // Prefer the shared audio context for visualization
+ if (sharedAudioContext) {
+ if (sharedAudioContext.state === 'suspended') {
+ sharedAudioContext.resume().catch(() => {
+ setAudioBlocked(true);
+ if (soundEnabledRef.current && !userInteracted) {
+ setShowAudioOverlay(true);
+ }
+ });
+ }
+ return sharedAudioContext;
+ }
+ // Fallback to local context
if (!audioContextRef.current) {
audioContextRef.current = new (window.AudioContext || (window as any).webkitAudioContext)();
}
@ -121,15 +137,16 @@ export const SettingsProvider = ({ children }: { children: ReactNode }) => {
}
return audioContextRef.current;
- }, [userInteracted]);
+ }, [userInteracted, sharedAudioContext]);
// Enable audio after user interaction
const enableAudio = useCallback(() => {
setUserInteracted(true);
setShowAudioOverlay(false);
- if (audioContextRef.current) {
+ const ctx = sharedAudioContext || audioContextRef.current;
- audioContextRef.current.resume().then(() => {
+ if (ctx) {
+ ctx.resume().then(() => {
setAudioBlocked(false);
}).catch(console.warn);
} else {
@ -142,7 +159,7 @@ export const SettingsProvider = ({ children }: { children: ReactNode }) => {
console.warn('AudioContext creation failed:', e);
}
}
- }, []);
+ }, [sharedAudioContext]);
// Disable audio
const disableAudio = useCallback(() => {
@ -190,7 +207,13 @@ export const SettingsProvider = ({ children }: { children: ReactNode }) => {
const gainNode = audioContext.createGain();
oscillator.connect(gainNode);
+ // Route through analyzer if available for visualization
+ if (analyzerNode) {
+ gainNode.connect(analyzerNode);
+ } else {
gainNode.connect(audioContext.destination);
+ }
const now = audioContext.currentTime;
@ -246,7 +269,7 @@ export const SettingsProvider = ({ children }: { children: ReactNode }) => {
console.warn('Audio playback failed:', e);
setAudioBlocked(true);
}
- }, [getAudioContext, audioBlocked, userInteracted]);
+ }, [getAudioContext, audioBlocked, userInteracted, analyzerNode]);
return (
<SettingsContext.Provider
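
The net effect of the last hunk is that synthesized sound effects show up on the oscilloscope: the gain node feeds the shared AnalyserNode (which AudioAnalyzerProvider already connects to the destination) instead of going straight to the speakers. A standalone sketch of that routing, with playBeep as a hypothetical helper and the frequency/gain values chosen arbitrarily:

// Hypothetical helper showing the sound-effect routing:
// oscillator -> gain -> analyser -> destination (the analyser passes audio through unchanged).
export function playBeep(ctx: AudioContext, analyzer: AnalyserNode | null, frequency = 880) {
  const oscillator = ctx.createOscillator();
  const gain = ctx.createGain();
  oscillator.type = 'square';
  oscillator.frequency.value = frequency;
  gain.gain.value = 0.05;
  oscillator.connect(gain);
  if (analyzer) {
    gain.connect(analyzer); // visible on the scope; assumes analyser -> destination is already wired
  } else {
    gain.connect(ctx.destination); // fallback: audible only, no visualization
  }
  oscillator.start();
  oscillator.stop(ctx.currentTime + 0.1);
}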

src/hooks/useAudioAnalyzer.ts (new executable file, 73 lines)
View File

@ -0,0 +1,73 @@
import { useRef, useState, useCallback } from 'react';
export interface AudioData {
leftChannel: Float32Array;
rightChannel: Float32Array;
sampleRate: number;
duration: number;
}
export function useAudioAnalyzer() {
const [audioData, setAudioData] = useState<AudioData | null>(null);
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const [fileName, setFileName] = useState<string | null>(null);
const [originalFile, setOriginalFile] = useState<File | null>(null);
const audioContextRef = useRef<AudioContext | null>(null);
const loadAudioFile = useCallback(async (file: File) => {
setIsLoading(true);
setError(null);
setFileName(file.name);
setOriginalFile(file);
try {
// Create or reuse AudioContext
if (!audioContextRef.current) {
audioContextRef.current = new AudioContext();
}
const audioContext = audioContextRef.current;
// Read file as ArrayBuffer
const arrayBuffer = await file.arrayBuffer();
// Decode audio data
const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
// Extract channel data
const leftChannel = audioBuffer.getChannelData(0);
const rightChannel = audioBuffer.numberOfChannels > 1
? audioBuffer.getChannelData(1)
: audioBuffer.getChannelData(0); // Mono: duplicate left channel
setAudioData({
leftChannel: new Float32Array(leftChannel),
rightChannel: new Float32Array(rightChannel),
sampleRate: audioBuffer.sampleRate,
duration: audioBuffer.duration,
});
} catch (err) {
setError(err instanceof Error ? err.message : 'Failed to load audio file');
setAudioData(null);
} finally {
setIsLoading(false);
}
}, []);
const reset = useCallback(() => {
setAudioData(null);
setFileName(null);
setOriginalFile(null);
setError(null);
}, []);
return {
audioData,
isLoading,
error,
fileName,
originalFile,
loadAudioFile,
reset,
};
}
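
Typical use of this hook is to hand it a File from a drop zone or file input and then pass the decoded audioData on to a renderer. A minimal sketch of a consumer (AudioFilePicker is hypothetical; the relative import path is assumed):

import { useAudioAnalyzer } from './useAudioAnalyzer';

// Hypothetical consumer: decodes a picked file and reports its duration.
export function AudioFilePicker() {
  const { audioData, isLoading, error, loadAudioFile } = useAudioAnalyzer();
  return (
    <div>
      <input
        type="file"
        accept="audio/*"
        onChange={(e) => {
          const file = e.target.files?.[0];
          if (file) loadAudioFile(file);
        }}
      />
      {isLoading && <p>Decoding…</p>}
      {error && <p>{error}</p>}
      {audioData && <p>{audioData.duration.toFixed(1)}s @ {audioData.sampleRate} Hz</p>}
    </div>
  );
}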

View File

@ -0,0 +1,405 @@
import { useState, useCallback, useRef } from 'react';
export const useOfflineVideoExport = () => {
const [state, setState] = useState({
isExporting: false,
progress: 0,
error: null as string | null,
stage: 'idle' as 'idle' | 'preparing' | 'rendering' | 'encoding' | 'complete',
fps: 0,
});
const cancelledRef = useRef(false);
const downloadBlob = useCallback((blob: Blob, filename: string) => {
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = filename;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url);
}, []);
const cancelExport = useCallback(() => {
console.log('Cancel export requested');
cancelledRef.current = true;
setState(prev => ({ ...prev, error: 'Cancelling...' }));
}, []);
const generateVideoWithAudio = useCallback(async (
audioFile: File,
drawFrame: (ctx: CanvasRenderingContext2D, width: number, height: number, leftData: Uint8Array, rightData: Uint8Array) => void,
options: { fps: number; format: 'webm' | 'mp4'; width: number; height: number; quality?: 'low' | 'medium' | 'high'; }
): Promise<Blob | null> => {
console.log('🚀 Starting video export with options:', options);
cancelledRef.current = false;
setState({ isExporting: true, progress: 0, error: null, stage: 'preparing', fps: 0 });
try {
const { fps, width, height, quality = 'medium' } = options;
// Quality settings
const qualitySettings = {
low: { bitrateMultiplier: 0.5, samplesPerFrame: 1024 },
medium: { bitrateMultiplier: 1.0, samplesPerFrame: 2048 },
high: { bitrateMultiplier: 1.5, samplesPerFrame: 4096 }
};
const qualityConfig = qualitySettings[quality];
// Create canvas for rendering
const canvas = document.createElement('canvas');
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext('2d');
if (!ctx) {
throw new Error('Canvas not supported');
}
setState(prev => ({ ...prev, stage: 'rendering', progress: 5 }));
// Load intro video - always use webm
console.log('📹 Loading intro video...');
const introVideo = document.createElement('video');
introVideo.muted = true;
introVideo.playsInline = true;
introVideo.preload = 'auto';
introVideo.src = '/intro.webm';
let introDuration = 0;
// Wait for video to be fully loaded
await new Promise<void>((resolve) => {
introVideo.onloadeddata = () => {
introDuration = introVideo.duration;
console.log(`✅ Intro video loaded: ${introDuration.toFixed(2)}s, ${introVideo.videoWidth}x${introVideo.videoHeight}`);
resolve();
};
introVideo.onerror = (e) => {
console.error('❌ Failed to load intro video:', e);
resolve();
};
introVideo.load();
});
setState(prev => ({ ...prev, progress: 10 }));
// Get supported codecs
const codecs = [
'video/webm;codecs=vp9',
'video/webm;codecs=vp8',
'video/mp4;codecs=h264',
'video/mp4',
'video/webm'
];
let selectedCodec = null;
let videoBitsPerSecond = 2000000; // Default 2Mbps
for (const codec of codecs) {
if (MediaRecorder.isTypeSupported(codec)) {
selectedCodec = codec;
console.log(`✅ Using codec: ${codec}`);
// Adjust bitrate based on codec and quality setting
if (codec.includes('vp9')) {
videoBitsPerSecond = Math.floor(3000000 * qualityConfig.bitrateMultiplier);
} else if (codec.includes('h264')) {
videoBitsPerSecond = Math.floor(4000000 * qualityConfig.bitrateMultiplier);
} else if (codec.includes('vp8')) {
videoBitsPerSecond = Math.floor(2000000 * qualityConfig.bitrateMultiplier);
}
break;
}
}
if (!selectedCodec) {
throw new Error('No video codec supported');
}
// Use real audio data if available, otherwise generate mock data
let audioBuffer: AudioBuffer;
let sampleRate: number;
let totalSamples: number;
let duration: number;
try {
// Try to decode the actual uploaded audio file
const arrayBuffer = await audioFile.arrayBuffer();
const audioContext = new AudioContext();
audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
sampleRate = audioBuffer.sampleRate;
totalSamples = audioBuffer.length;
duration = totalSamples / sampleRate;
console.log(`✅ Using real audio: ${duration.toFixed(1)}s, ${totalSamples} samples`);
} catch (audioError) {
console.warn('⚠️ Could not decode audio file, using mock data:', audioError);
// Generate mock audio data
duration = 5.0; // 5 seconds
sampleRate = 44100;
totalSamples = Math.floor(duration * sampleRate);
// Create a proper AudioBuffer for mock data
const mockAudioContext = new AudioContext();
audioBuffer = mockAudioContext.createBuffer(2, totalSamples, sampleRate);
// Fill with sine wave
const leftChannel = audioBuffer.getChannelData(0);
const rightChannel = audioBuffer.getChannelData(1);
for (let i = 0; i < totalSamples; i++) {
const time = i / sampleRate;
const frequency = 440; // A4 note
const value = Math.sin(2 * Math.PI * frequency * time) * 0.5;
leftChannel[i] = value;
rightChannel[i] = value;
}
console.log(`📊 Using mock audio: ${duration.toFixed(1)}s, ${totalSamples} samples`);
}
// Create audio context for recording
const recordingAudioContext = new AudioContext();
// Resume audio context if suspended
if (recordingAudioContext.state === 'suspended') {
await recordingAudioContext.resume();
}
// Create audio source and destination
const recordingAudioSource = recordingAudioContext.createBufferSource();
recordingAudioSource.buffer = audioBuffer;
recordingAudioSource.loop = false;
const audioDestination = recordingAudioContext.createMediaStreamDestination();
recordingAudioSource.connect(audioDestination);
recordingAudioSource.connect(recordingAudioContext.destination);
// Combine video and audio streams
const combinedStream = new MediaStream();
canvas.captureStream(fps).getVideoTracks().forEach(track => combinedStream.addTrack(track));
audioDestination.stream.getAudioTracks().forEach(track => combinedStream.addTrack(track));
console.log(`✅ Combined stream: ${combinedStream.getVideoTracks().length} video, ${combinedStream.getAudioTracks().length} audio tracks`);
// Chunks array to collect recorded data
const chunks: Blob[] = [];
const recorder = new MediaRecorder(combinedStream, {
mimeType: selectedCodec,
videoBitsPerSecond: videoBitsPerSecond,
});
recorder.ondataavailable = (e) => {
if (e.data.size > 0) {
chunks.push(e.data);
}
};
console.log('✅ MediaRecorder created with audio and video');
recorder.start(1000); // 1 second chunks
// Calculate total frames including intro
const introFrames = introDuration > 0 ? Math.ceil(introDuration * fps) : 0;
const mainFrames = Math.ceil(duration * fps);
const fadeFrames = Math.ceil(fps * 0.5); // 0.5 second fade
const totalFrames = introFrames + mainFrames;
const samplesPerFrame = Math.min(qualityConfig.samplesPerFrame, Math.floor(totalSamples / mainFrames));
console.log(`🎬 Total frames: ${totalFrames} (intro: ${introFrames}, main: ${mainFrames}, fade: ${fadeFrames})`);
// Render intro frames first
if (introFrames > 0) {
console.log('📹 Rendering intro frames...');
for (let frameIndex = 0; frameIndex < introFrames; frameIndex++) {
if (cancelledRef.current) {
recorder.stop();
setState({ isExporting: false, progress: 0, error: 'Cancelled', stage: 'idle', fps: 0 });
return null;
}
// Seek to correct time and wait for frame
const targetTime = frameIndex / fps;
introVideo.currentTime = targetTime;
// Wait for the seek to complete
await new Promise<void>((resolve) => {
const onSeeked = () => {
introVideo.removeEventListener('seeked', onSeeked);
resolve();
};
introVideo.addEventListener('seeked', onSeeked);
// Fallback timeout
setTimeout(resolve, 50);
});
// Draw intro video frame scaled to canvas
ctx.fillStyle = '#0a0f0a';
ctx.fillRect(0, 0, width, height);
// Calculate aspect-ratio-correct scaling
const videoAspect = introVideo.videoWidth / introVideo.videoHeight;
const canvasAspect = width / height;
let drawWidth = width;
let drawHeight = height;
let drawX = 0;
let drawY = 0;
if (videoAspect > canvasAspect) {
drawHeight = width / videoAspect;
drawY = (height - drawHeight) / 2;
} else {
drawWidth = height * videoAspect;
drawX = (width - drawWidth) / 2;
}
ctx.drawImage(introVideo, drawX, drawY, drawWidth, drawHeight);
const progress = 10 + Math.round((frameIndex / introFrames) * 20);
setState(prev => ({ ...prev, progress }));
await new Promise(resolve => setTimeout(resolve, 1000 / fps));
}
console.log('✅ Intro frames complete');
}
// Start audio playback for main content
recordingAudioSource.start(0);
console.log('🔊 Audio playback started for recording');
// Render main oscilloscope frames with fade-in from intro
for (let frameIndex = 0; frameIndex < mainFrames; frameIndex++) {
if (cancelledRef.current) {
try {
recordingAudioSource.stop();
recordingAudioContext.close();
} catch (e) {}
recorder.stop();
setState({ isExporting: false, progress: 0, error: 'Cancelled', stage: 'idle', fps: 0 });
return null;
}
// Calculate current audio position for this frame
const currentSample = Math.min(frameIndex * samplesPerFrame, totalSamples - samplesPerFrame);
// Get waveform data from actual audio buffer
const leftChannel = audioBuffer.getChannelData(0);
const rightChannel = audioBuffer.numberOfChannels > 1 ? audioBuffer.getChannelData(1) : leftChannel;
// Create waveform data for this frame
const leftData = new Uint8Array(samplesPerFrame);
const rightData = new Uint8Array(samplesPerFrame);
for (let i = 0; i < samplesPerFrame; i++) {
const sampleIndex = currentSample + i;
if (sampleIndex >= 0 && sampleIndex < totalSamples) {
// Convert from -1..1 range to 0..255 range
leftData[i] = Math.round(((leftChannel[sampleIndex] + 1) / 2) * 255);
rightData[i] = Math.round(((rightChannel[sampleIndex] + 1) / 2) * 255);
} else {
leftData[i] = 128;
rightData[i] = 128;
}
}
// Clear canvas
ctx.fillStyle = '#0a0f0a';
ctx.fillRect(0, 0, width, height);
// Draw oscilloscope with audio data
try {
drawFrame(ctx, width, height, leftData, rightData);
} catch (drawError) {
console.error('❌ Error in drawFrame:', drawError);
// Fallback: simple waveform
ctx.strokeStyle = '#00ff00';
ctx.lineWidth = 2;
ctx.beginPath();
for (let x = 0; x < width; x += 4) {
const sampleIndex = Math.floor((x / width) * samplesPerFrame);
const value = sampleIndex < leftData.length ? leftData[sampleIndex] : 128;
const y = height / 2 + ((value - 128) / 128) * (height / 4);
if (x === 0) {
ctx.moveTo(x, y);
} else {
ctx.lineTo(x, y);
}
}
ctx.stroke();
}
// Apply fade-in effect from intro (first fadeFrames of main content)
if (introDuration > 0 && frameIndex < fadeFrames) {
const fadeProgress = frameIndex / fadeFrames;
// Draw a semi-transparent black overlay that fades out
ctx.fillStyle = `rgba(10, 15, 10, ${1 - fadeProgress})`;
ctx.fillRect(0, 0, width, height);
}
// Add frame info
ctx.fillStyle = '#ffffff';
ctx.font = '16px monospace';
ctx.fillText(`Frame ${introFrames + frameIndex + 1}/${totalFrames}`, 20, 30);
ctx.fillText(`Time: ${(frameIndex / fps).toFixed(1)}s`, 20, 50);
const progress = 30 + Math.round((frameIndex / mainFrames) * 60);
setState(prev => ({ ...prev, progress }));
if (frameIndex % Math.max(1, Math.floor(mainFrames / 10)) === 0) {
console.log(`📸 Frame ${frameIndex + 1}/${mainFrames} (${progress}%) - Time: ${(frameIndex / fps).toFixed(1)}s`);
}
// Frame timing
await new Promise(resolve => setTimeout(resolve, 1000 / fps));
}
setState(prev => ({ ...prev, progress: 90 }));
console.log('⏹️ Stopping recorder...');
recorder.stop();
try {
recordingAudioSource.stop();
recordingAudioContext.close();
} catch (e) {
console.warn('Error stopping audio:', e);
}
// Wait for completion
await new Promise<void>((resolve) => {
const checkInterval = setInterval(() => {
if (recorder.state === 'inactive') {
clearInterval(checkInterval);
resolve();
}
}, 100);
});
if (chunks.length === 0) {
throw new Error('No video chunks recorded');
}
const videoBlob = new Blob(chunks, { type: selectedCodec });
console.log(`✅ Video created: ${(videoBlob.size / 1024 / 1024).toFixed(2)} MB`);
setState({ isExporting: false, progress: 100, error: null, stage: 'complete', fps: 0 });
return videoBlob;
} catch (error) {
console.error('❌ Export failed:', error);
setState({ isExporting: false, progress: 0, error: error instanceof Error ? error.message : 'Export failed', stage: 'idle', fps: 0 });
return null;
}
}, []);
return {
...state,
generateVideoWithAudio,
cancelExport,
downloadBlob,
};
};
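
Calling the hook looks roughly like the sketch below: the caller supplies the audio file plus a drawFrame callback that receives per-frame waveform data in the 0–255 range (128 = silence), then downloads the returned blob. The ExportButton component and its drawFrame body are hypothetical, not part of this commit:

import { useOfflineVideoExport } from './useOfflineVideoExport';

// Hypothetical caller: renders both channels as a single centered trace.
export function ExportButton({ audioFile }: { audioFile: File }) {
  const { isExporting, progress, generateVideoWithAudio, downloadBlob } = useOfflineVideoExport();

  const handleExport = async () => {
    const blob = await generateVideoWithAudio(
      audioFile,
      (ctx, width, height, leftData, rightData) => {
        ctx.strokeStyle = '#00ff66';
        ctx.lineWidth = 2;
        ctx.beginPath();
        for (let x = 0; x < width; x++) {
          const i = Math.floor((x / width) * leftData.length);
          const mono = (leftData[i] + rightData[i]) / 2; // 0..255, 128 = silence
          const y = height / 2 + ((mono - 128) / 128) * (height * 0.4);
          if (x === 0) ctx.moveTo(x, y);
          else ctx.lineTo(x, y);
        }
        ctx.stroke();
      },
      { fps: 30, format: 'webm', width: 1280, height: 720, quality: 'medium' }
    );
    if (blob) downloadBlob(blob, 'oscilloscope.webm');
  };

  return (
    <button onClick={handleExport} disabled={isExporting}>
      {isExporting ? `Exporting… ${progress}%` : 'Export video'}
    </button>
  );
}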

View File

@ -0,0 +1,420 @@
import { useRef, useCallback, useEffect } from 'react';
import type { AudioData } from './useAudioAnalyzer';
export type OscilloscopeMode = 'combined' | 'separate' | 'all';
interface RendererOptions {
mode: OscilloscopeMode;
width: number;
height: number;
phosphorColor: string;
persistence: number;
}
// WebGL shaders for GPU-accelerated rendering
const VERTEX_SHADER = `
attribute vec2 a_position;
uniform vec2 u_resolution;
void main() {
vec2 clipSpace = (a_position / u_resolution) * 2.0 - 1.0;
gl_Position = vec4(clipSpace * vec2(1, -1), 0, 1);
}
`;
const TRACE_FRAGMENT_SHADER = `
precision mediump float;
uniform vec4 u_color;
void main() {
gl_FragColor = u_color;
}
`;
const FADE_VERTEX_SHADER = `
attribute vec2 a_position;
void main() {
gl_Position = vec4(a_position, 0, 1);
}
`;
const FADE_FRAGMENT_SHADER = `
precision mediump float;
uniform float u_fade;
void main() {
gl_FragColor = vec4(0.0, 0.031, 0.0, u_fade);
}
`;
function createShader(gl: WebGLRenderingContext, type: number, source: string): WebGLShader | null {
const shader = gl.createShader(type);
if (!shader) return null;
gl.shaderSource(shader, source);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.error('Shader compile error:', gl.getShaderInfoLog(shader));
gl.deleteShader(shader);
return null;
}
return shader;
}
function createProgram(gl: WebGLRenderingContext, vertexShader: WebGLShader, fragmentShader: WebGLShader): WebGLProgram | null {
const program = gl.createProgram();
if (!program) return null;
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.error('Program link error:', gl.getProgramInfoLog(program));
gl.deleteProgram(program);
return null;
}
return program;
}
interface WebGLResources {
gl: WebGLRenderingContext;
traceProgram: WebGLProgram;
fadeProgram: WebGLProgram;
positionBuffer: WebGLBuffer;
fadeBuffer: WebGLBuffer;
tracePositionLocation: number;
traceResolutionLocation: WebGLUniformLocation;
traceColorLocation: WebGLUniformLocation;
fadePositionLocation: number;
fadeFadeLocation: WebGLUniformLocation;
}
export function useOscilloscopeRenderer() {
const canvasRef = useRef<HTMLCanvasElement | null>(null);
const glResourcesRef = useRef<WebGLResources | null>(null);
const animationFrameRef = useRef<number | null>(null);
const currentSampleRef = useRef(0);
const initCanvas = useCallback((canvas: HTMLCanvasElement) => {
canvasRef.current = canvas;
const gl = canvas.getContext('webgl', {
preserveDrawingBuffer: true,
antialias: true,
alpha: false
});
if (!gl) {
console.error('WebGL not supported, falling back to 2D');
return;
}
// Create trace shader program
const traceVS = createShader(gl, gl.VERTEX_SHADER, VERTEX_SHADER);
const traceFS = createShader(gl, gl.FRAGMENT_SHADER, TRACE_FRAGMENT_SHADER);
if (!traceVS || !traceFS) return;
const traceProgram = createProgram(gl, traceVS, traceFS);
if (!traceProgram) return;
// Create fade shader program
const fadeVS = createShader(gl, gl.VERTEX_SHADER, FADE_VERTEX_SHADER);
const fadeFS = createShader(gl, gl.FRAGMENT_SHADER, FADE_FRAGMENT_SHADER);
if (!fadeVS || !fadeFS) return;
const fadeProgram = createProgram(gl, fadeVS, fadeFS);
if (!fadeProgram) return;
// Create buffers
const positionBuffer = gl.createBuffer();
const fadeBuffer = gl.createBuffer();
if (!positionBuffer || !fadeBuffer) return;
// Set up fade quad
gl.bindBuffer(gl.ARRAY_BUFFER, fadeBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
-1, -1,
1, -1,
-1, 1,
-1, 1,
1, -1,
1, 1,
]), gl.STATIC_DRAW);
// Get attribute and uniform locations
const tracePositionLocation = gl.getAttribLocation(traceProgram, 'a_position');
const traceResolutionLocation = gl.getUniformLocation(traceProgram, 'u_resolution');
const traceColorLocation = gl.getUniformLocation(traceProgram, 'u_color');
const fadePositionLocation = gl.getAttribLocation(fadeProgram, 'a_position');
const fadeFadeLocation = gl.getUniformLocation(fadeProgram, 'u_fade');
if (!traceResolutionLocation || !traceColorLocation || !fadeFadeLocation) return;
// Enable blending
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
// Initial clear (pure black)
gl.viewport(0, 0, canvas.width, canvas.height);
gl.clearColor(0, 0, 0, 1);
gl.clear(gl.COLOR_BUFFER_BIT);
glResourcesRef.current = {
gl,
traceProgram,
fadeProgram,
positionBuffer,
fadeBuffer,
tracePositionLocation,
traceResolutionLocation,
traceColorLocation,
fadePositionLocation,
fadeFadeLocation,
};
}, []);
const parseColor = (colorStr: string): [number, number, number, number] => {
// Parse hex color to RGBA
const hex = colorStr.replace('#', '');
const r = parseInt(hex.substring(0, 2), 16) / 255;
const g = parseInt(hex.substring(2, 4), 16) / 255;
const b = parseInt(hex.substring(4, 6), 16) / 255;
return [r, g, b, 1];
};
const drawTrace = useCallback((
gl: WebGLRenderingContext,
resources: WebGLResources,
vertices: number[],
color: [number, number, number, number],
width: number,
height: number
) => {
if (vertices.length < 4) return;
const { traceProgram, positionBuffer, tracePositionLocation, traceResolutionLocation, traceColorLocation } = resources;
gl.useProgram(traceProgram);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.DYNAMIC_DRAW);
gl.enableVertexAttribArray(tracePositionLocation);
gl.vertexAttribPointer(tracePositionLocation, 2, gl.FLOAT, false, 0, 0);
gl.uniform2f(traceResolutionLocation, width, height);
gl.uniform4f(traceColorLocation, color[0], color[1], color[2], color[3]);
gl.lineWidth(2);
gl.drawArrays(gl.LINE_STRIP, 0, vertices.length / 2);
}, []);
const drawFrame = useCallback((
audioData: AudioData,
options: RendererOptions,
samplesPerFrame: number
) => {
const resources = glResourcesRef.current;
const canvas = canvasRef.current;
if (!resources || !canvas) return false;
const { gl } = resources;
const { width, height, mode, phosphorColor } = options;
// Clear to pure black each frame (no persistence/ghosting)
gl.viewport(0, 0, width, height);
gl.clearColor(0, 0, 0, 1);
gl.clear(gl.COLOR_BUFFER_BIT);
// Get current sample position
const startSample = currentSampleRef.current;
const endSample = Math.min(startSample + samplesPerFrame, audioData.leftChannel.length);
const color = parseColor(phosphorColor);
const leftColor: [number, number, number, number] = [0, 1, 0, 1]; // Green for left
const rightColor: [number, number, number, number] = [0, 0.8, 1, 1]; // Cyan for right
const xyColor: [number, number, number, number] = [1, 0.5, 0, 1]; // Orange for XY
if (mode === 'combined') {
// Combined: both channels merged into single waveform
const vertices: number[] = [];
const samplesPerPixel = samplesPerFrame / width;
const centerY = height / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.leftChannel.length) break;
const sample = (audioData.leftChannel[sampleIndex] + audioData.rightChannel[sampleIndex]) / 2;
const y = centerY - sample * (height * 0.4);
vertices.push(x, y);
}
drawTrace(gl, resources, vertices, color, width, height);
} else if (mode === 'separate') {
// Separate: Left on top half, Right on bottom half
const halfHeight = height / 2;
const samplesPerPixel = samplesPerFrame / width;
// Left channel (top half)
const leftVertices: number[] = [];
const leftCenterY = halfHeight / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.leftChannel.length) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (halfHeight * 0.35);
leftVertices.push(x, y);
}
drawTrace(gl, resources, leftVertices, leftColor, width, height);
// Right channel (bottom half)
const rightVertices: number[] = [];
const rightCenterY = halfHeight + halfHeight / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.rightChannel.length) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (halfHeight * 0.35);
rightVertices.push(x, y);
}
drawTrace(gl, resources, rightVertices, rightColor, width, height);
// Draw divider line
const dividerVertices = [0, halfHeight, width, halfHeight];
drawTrace(gl, resources, dividerVertices, [0.2, 0.2, 0.2, 1], width, height);
} else if (mode === 'all') {
// All: L/R waveforms on top row, XY on bottom
const topHeight = height / 2;
const bottomHeight = height / 2;
const halfWidth = width / 2;
const samplesPerPixel = samplesPerFrame / halfWidth;
// Left channel (top-left quadrant)
const leftVertices: number[] = [];
const leftCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.leftChannel.length) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (topHeight * 0.35);
leftVertices.push(x, y);
}
drawTrace(gl, resources, leftVertices, leftColor, width, height);
// Right channel (top-right quadrant)
const rightVertices: number[] = [];
const rightCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= audioData.rightChannel.length) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (topHeight * 0.35);
rightVertices.push(halfWidth + x, y);
}
drawTrace(gl, resources, rightVertices, rightColor, width, height);
// XY mode (bottom half, centered)
const xyVertices: number[] = [];
const xyCenterX = width / 2;
const xyCenterY = topHeight + bottomHeight / 2;
const xyScale = Math.min(halfWidth, bottomHeight) * 0.35;
for (let i = startSample; i < endSample; i++) {
const x = xyCenterX + audioData.leftChannel[i] * xyScale;
const y = xyCenterY - audioData.rightChannel[i] * xyScale;
xyVertices.push(x, y);
}
drawTrace(gl, resources, xyVertices, xyColor, width, height);
// Draw divider lines
drawTrace(gl, resources, [0, topHeight, width, topHeight], [0.2, 0.2, 0.2, 1], width, height);
drawTrace(gl, resources, [halfWidth, 0, halfWidth, topHeight], [0.2, 0.2, 0.2, 1], width, height);
}
// Update sample position
currentSampleRef.current = endSample;
return endSample >= audioData.leftChannel.length;
}, [drawTrace]);
const draw2DGraticule = (canvas: HTMLCanvasElement, width: number, height: number) => {
// Get 2D context for graticule overlay
const ctx = canvas.getContext('2d');
if (!ctx) return;
ctx.strokeStyle = 'rgba(0, 100, 0, 0.3)';
ctx.lineWidth = 1;
const divisions = 8;
const cellWidth = width / divisions;
const cellHeight = height / divisions;
for (let i = 0; i <= divisions; i++) {
ctx.beginPath();
ctx.moveTo(i * cellWidth, 0);
ctx.lineTo(i * cellWidth, height);
ctx.stroke();
ctx.beginPath();
ctx.moveTo(0, i * cellHeight);
ctx.lineTo(width, i * cellHeight);
ctx.stroke();
}
ctx.strokeStyle = 'rgba(0, 150, 0, 0.5)';
ctx.lineWidth = 2;
ctx.beginPath();
ctx.moveTo(0, height / 2);
ctx.lineTo(width, height / 2);
ctx.stroke();
ctx.beginPath();
ctx.moveTo(width / 2, 0);
ctx.lineTo(width / 2, height);
ctx.stroke();
};
const resetPlayback = useCallback(() => {
currentSampleRef.current = 0;
const resources = glResourcesRef.current;
if (resources) {
const { gl } = resources;
gl.clearColor(0, 0, 0, 1);
gl.clear(gl.COLOR_BUFFER_BIT);
}
}, []);
const stopAnimation = useCallback(() => {
if (animationFrameRef.current) {
cancelAnimationFrame(animationFrameRef.current);
animationFrameRef.current = null;
}
}, []);
const getCurrentSample = useCallback(() => currentSampleRef.current, []);
useEffect(() => {
return () => {
stopAnimation();
// Clean up WebGL resources
if (glResourcesRef.current) {
const { gl, traceProgram, fadeProgram, positionBuffer, fadeBuffer } = glResourcesRef.current;
gl.deleteProgram(traceProgram);
gl.deleteProgram(fadeProgram);
gl.deleteBuffer(positionBuffer);
gl.deleteBuffer(fadeBuffer);
glResourcesRef.current = null;
}
};
}, [stopAnimation]);
return {
canvasRef,
initCanvas,
drawFrame,
resetPlayback,
stopAnimation,
getCurrentSample,
};
}
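
Driving this renderer amounts to calling initCanvas once with a real canvas and then calling drawFrame every animation frame until it returns true (end of buffer). A minimal sketch under those assumptions (WebGLScope, the 800x400 size, and the ~60 fps pacing are illustrative only):

import { useEffect, useRef } from 'react';
import { useOscilloscopeRenderer } from './useOscilloscopeRenderer';
import type { AudioData } from './useAudioAnalyzer';

// Hypothetical consumer: plays the whole decoded buffer through the WebGL renderer.
export function WebGLScope({ audioData }: { audioData: AudioData }) {
  const localCanvasRef = useRef<HTMLCanvasElement | null>(null);
  const { initCanvas, drawFrame, resetPlayback } = useOscilloscopeRenderer();

  useEffect(() => {
    if (!localCanvasRef.current) return;
    initCanvas(localCanvasRef.current); // compiles shaders and clears to black
    resetPlayback();
    const samplesPerFrame = Math.floor(audioData.sampleRate / 60); // assumes ~60 fps display
    let rafId = 0;
    const tick = () => {
      const finished = drawFrame(
        audioData,
        { mode: 'combined', width: 800, height: 400, phosphorColor: '#00ff66', persistence: 0 },
        samplesPerFrame
      );
      if (!finished) rafId = requestAnimationFrame(tick);
    };
    rafId = requestAnimationFrame(tick);
    return () => cancelAnimationFrame(rafId);
  }, [audioData, initCanvas, drawFrame, resetPlayback]);

  return <canvas ref={localCanvasRef} width={800} height={400} />;
}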

src/hooks/useVideoExporter.ts (new executable file, 528 lines)
View File

@ -0,0 +1,528 @@
import { useState, useCallback, useRef } from 'react';
import type { AudioData } from './useAudioAnalyzer';
import type { OscilloscopeMode } from './useOscilloscopeRenderer';
interface ExportOptions {
width: number;
height: number;
fps: number;
mode: OscilloscopeMode;
audioFile: File;
format?: 'webm' | 'mp4';
quality?: 'low' | 'medium' | 'high';
}
// WebGL shaders
const VERTEX_SHADER = `
attribute vec2 a_position;
uniform vec2 u_resolution;
void main() {
vec2 clipSpace = (a_position / u_resolution) * 2.0 - 1.0;
gl_Position = vec4(clipSpace * vec2(1, -1), 0, 1);
}
`;
const TRACE_FRAGMENT_SHADER = `
precision mediump float;
uniform vec4 u_color;
void main() {
gl_FragColor = u_color;
}
`;
function createShader(gl: WebGLRenderingContext, type: number, source: string): WebGLShader | null {
const shader = gl.createShader(type);
if (!shader) return null;
gl.shaderSource(shader, source);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.error('Shader compile error:', gl.getShaderInfoLog(shader));
gl.deleteShader(shader);
return null;
}
return shader;
}
function createProgram(gl: WebGLRenderingContext, vertexShader: WebGLShader, fragmentShader: WebGLShader): WebGLProgram | null {
const program = gl.createProgram();
if (!program) return null;
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.error('Program link error:', gl.getProgramInfoLog(program));
gl.deleteProgram(program);
return null;
}
return program;
}
export function useVideoExporter() {
const [isExporting, setIsExporting] = useState(false);
const [progress, setProgress] = useState(0);
const [exportedUrl, setExportedUrl] = useState<string | null>(null);
const cancelRef = useRef(false);
const exportVideo = useCallback(async (
audioData: AudioData,
audioFile: File,
options: ExportOptions
) => {
setIsExporting(true);
setProgress(0);
setExportedUrl(null);
cancelRef.current = false;
const { width, height, fps, mode } = options;
const totalSamples = audioData.leftChannel.length;
const samplesPerFrame = Math.floor(audioData.sampleRate / fps);
const log = (...args: unknown[]) => {
console.log('[useVideoExporter]', ...args);
};
log('export start', {
width,
height,
fps,
mode,
analyzerSampleRate: audioData.sampleRate,
totalSamples,
samplesPerFrame,
estimatedDuration: totalSamples / audioData.sampleRate,
});
// Create WebGL canvas for rendering
const canvas = document.createElement('canvas');
canvas.width = width;
canvas.height = height;
const gl = canvas.getContext('webgl', {
preserveDrawingBuffer: true,
antialias: true,
alpha: false,
});
if (!gl) {
console.error('WebGL not available');
setIsExporting(false);
return null;
}
// Set up WebGL program
const traceVS = createShader(gl, gl.VERTEX_SHADER, VERTEX_SHADER);
const traceFS = createShader(gl, gl.FRAGMENT_SHADER, TRACE_FRAGMENT_SHADER);
if (!traceVS || !traceFS) {
setIsExporting(false);
return null;
}
const traceProgram = createProgram(gl, traceVS, traceFS);
if (!traceProgram) {
setIsExporting(false);
return null;
}
const positionBuffer = gl.createBuffer();
if (!positionBuffer) {
setIsExporting(false);
return null;
}
const tracePositionLocation = gl.getAttribLocation(traceProgram, 'a_position');
const traceResolutionLocation = gl.getUniformLocation(traceProgram, 'u_resolution');
const traceColorLocation = gl.getUniformLocation(traceProgram, 'u_color');
if (!traceResolutionLocation || !traceColorLocation) {
setIsExporting(false);
return null;
}
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
gl.viewport(0, 0, width, height);
// Helper to draw a trace
const drawTrace = (vertices: number[], color: [number, number, number, number]) => {
if (vertices.length < 4) return;
gl.useProgram(traceProgram);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.DYNAMIC_DRAW);
gl.enableVertexAttribArray(tracePositionLocation);
gl.vertexAttribPointer(tracePositionLocation, 2, gl.FLOAT, false, 0, 0);
gl.uniform2f(traceResolutionLocation, width, height);
gl.uniform4f(traceColorLocation, color[0], color[1], color[2], color[3]);
gl.lineWidth(2);
gl.drawArrays(gl.LINE_STRIP, 0, vertices.length / 2);
};
// Function to render a single frame at a specific sample position
const renderFrameAtSample = (startSample: number): void => {
gl.clearColor(0, 0, 0, 1);
gl.clear(gl.COLOR_BUFFER_BIT);
const endSample = Math.min(startSample + samplesPerFrame, totalSamples);
const leftColor: [number, number, number, number] = [0, 1, 0, 1];
const rightColor: [number, number, number, number] = [0, 0.8, 1, 1];
const xyColor: [number, number, number, number] = [1, 0.5, 0, 1];
const dividerColor: [number, number, number, number] = [0.2, 0.2, 0.2, 1];
if (mode === 'combined') {
const vertices: number[] = [];
const samplesPerPixel = samplesPerFrame / width;
const centerY = height / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = (audioData.leftChannel[sampleIndex] + audioData.rightChannel[sampleIndex]) / 2;
const y = centerY - sample * (height * 0.4);
vertices.push(x, y);
}
drawTrace(vertices, leftColor);
} else if (mode === 'separate') {
const halfHeight = height / 2;
const samplesPerPixel = samplesPerFrame / width;
// Left channel (top half)
const leftVertices: number[] = [];
const leftCenterY = halfHeight / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (halfHeight * 0.35);
leftVertices.push(x, y);
}
drawTrace(leftVertices, leftColor);
// Right channel (bottom half)
const rightVertices: number[] = [];
const rightCenterY = halfHeight + halfHeight / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (halfHeight * 0.35);
rightVertices.push(x, y);
}
drawTrace(rightVertices, rightColor);
// Divider
drawTrace([0, halfHeight, width, halfHeight], dividerColor);
} else if (mode === 'all') {
const topHeight = height / 2;
const bottomHeight = height / 2;
const halfWidth = width / 2;
const samplesPerPixel = samplesPerFrame / halfWidth;
// Left channel (top-left)
const leftVertices: number[] = [];
const leftCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (topHeight * 0.35);
leftVertices.push(x, y);
}
drawTrace(leftVertices, leftColor);
// Right channel (top-right)
const rightVertices: number[] = [];
const rightCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (topHeight * 0.35);
rightVertices.push(halfWidth + x, y);
}
drawTrace(rightVertices, rightColor);
// XY mode (bottom half)
const xyVertices: number[] = [];
const xyCenterX = width / 2;
const xyCenterY = topHeight + bottomHeight / 2;
const xyScale = Math.min(halfWidth, bottomHeight) * 0.35;
for (let i = startSample; i < endSample; i++) {
const x = xyCenterX + audioData.leftChannel[i] * xyScale;
const y = xyCenterY - audioData.rightChannel[i] * xyScale;
xyVertices.push(x, y);
}
drawTrace(xyVertices, xyColor);
// Dividers
drawTrace([0, topHeight, width, topHeight], dividerColor);
drawTrace([halfWidth, 0, halfWidth, topHeight], dividerColor);
}
};
// Capture stream at the target FPS
const videoStream = canvas.captureStream(fps);
// Decode audio
let audioContext: AudioContext;
try {
audioContext = new AudioContext({ sampleRate: audioData.sampleRate });
} catch {
log('AudioContext({sampleRate}) failed; falling back to default AudioContext()');
audioContext = new AudioContext();
}
await audioContext.resume();
const audioArrayBuffer = await audioFile.arrayBuffer();
const audioBuffer = await audioContext.decodeAudioData(audioArrayBuffer);
log('decoded audio', {
ctxSampleRate: audioContext.sampleRate,
duration: audioBuffer.duration,
channels: audioBuffer.numberOfChannels,
});
const audioSource = audioContext.createBufferSource();
audioSource.buffer = audioBuffer;
const audioDestination = audioContext.createMediaStreamDestination();
audioSource.connect(audioDestination);
const combinedStream = new MediaStream([
...videoStream.getVideoTracks(),
...audioDestination.stream.getAudioTracks(),
]);
// Prefer VP8 for broad compatibility
let mimeType = 'video/webm;codecs=vp8,opus';
if (!MediaRecorder.isTypeSupported(mimeType)) {
mimeType = 'video/webm;codecs=vp9,opus';
}
if (!MediaRecorder.isTypeSupported(mimeType)) {
mimeType = 'video/webm';
}
log('MediaRecorder setup', {
requestedMimeType: mimeType,
videoBitsPerSecond: 8000000,
audioBitsPerSecond: 256000,
});
const mediaRecorder = new MediaRecorder(combinedStream, {
mimeType,
videoBitsPerSecond: 8000000,
audioBitsPerSecond: 256000,
});
const chunks: Blob[] = [];
let chunkBytes = 0;
mediaRecorder.onstart = () =>
log('MediaRecorder onstart', { state: mediaRecorder.state, mimeType: mediaRecorder.mimeType });
mediaRecorder.ondataavailable = (e) => {
const size = e?.data?.size ?? 0;
log('MediaRecorder ondataavailable', {
size,
type: e?.data?.type,
recorderState: mediaRecorder.state,
});
if (e.data && e.data.size > 0) {
chunks.push(e.data);
chunkBytes += e.data.size;
}
};
return new Promise<string>((resolve, reject) => {
let stopped = false;
let stopReason: string = 'unknown';
let lastRenderedFrame = -1;
let lastLoggedSecond = -1;
let rafId = 0;
let safetyTimer: number | null = null;
const stopRecorder = (reason: string) => {
if (stopped) return;
stopped = true;
stopReason = reason;
log('stopRecorder()', {
reason,
recorderState: mediaRecorder.state,
chunks: chunks.length,
chunkBytes,
});
if (rafId) cancelAnimationFrame(rafId);
if (safetyTimer) window.clearTimeout(safetyTimer);
if (reason === 'cancel') {
try {
audioSource.stop();
} catch {
// ignore
}
}
try {
if (mediaRecorder.state === 'recording') {
log('calling mediaRecorder.stop()');
mediaRecorder.stop();
}
} catch (e) {
log('mediaRecorder.stop() failed', e);
}
};
audioSource.onended = () => {
log('audioSource.onended');
try {
const endSample = Math.max(0, totalSamples - samplesPerFrame);
renderFrameAtSample(endSample);
} catch (e) {
log('final frame render failed', e);
}
stopRecorder('audio_ended');
};
mediaRecorder.onstop = async () => {
log('MediaRecorder onstop', { stopReason, chunks: chunks.length, chunkBytes });
// Cleanup WebGL
gl.deleteProgram(traceProgram);
gl.deleteBuffer(positionBuffer);
try {
await audioContext.close();
} catch {
// ignore
}
try {
combinedStream.getTracks().forEach((t) => t.stop());
} catch {
// ignore
}
const finalMime = mediaRecorder.mimeType || mimeType;
const blob = new Blob(chunks, { type: finalMime });
log('final blob', {
mime: finalMime,
blobSize: blob.size,
chunks: chunks.length,
chunkBytes,
});
if (blob.size === 0) {
setIsExporting(false);
reject(new Error('Export failed: empty recording blob'));
return;
}
const url = URL.createObjectURL(blob);
setExportedUrl(url);
setIsExporting(false);
setProgress(100);
resolve(url);
};
mediaRecorder.onerror = (e) => {
log('MediaRecorder onerror', e);
setIsExporting(false);
reject(e);
};
// Start without timeslice - this creates a single continuous WebM file
mediaRecorder.start();
log('mediaRecorder.start() called', { state: mediaRecorder.state, mimeType: mediaRecorder.mimeType });
const exportStart = audioContext.currentTime;
audioSource.start(0);
log('audioSource.start() called', { exportStart, duration: audioBuffer.duration });
// Safety timeout: for very long files (6+ hours = 21600+ seconds), add generous buffer
const safetyDuration = Math.ceil(audioBuffer.duration * 1000 + 30000); // 30s buffer
log('safety timer set', { safetyDuration, durationSeconds: audioBuffer.duration });
safetyTimer = window.setTimeout(() => {
log('safety timeout hit');
stopRecorder('safety_timeout');
}, safetyDuration);
const renderLoop = () => {
if (stopped) return;
if (cancelRef.current) {
log('cancelRef triggered');
stopRecorder('cancel');
return;
}
const t = Math.max(0, audioContext.currentTime - exportStart);
// Heartbeat every 10 seconds for long exports
const sec = Math.floor(t / 10) * 10;
if (sec !== lastLoggedSecond && sec > 0) {
lastLoggedSecond = sec;
log('heartbeat', {
t: t.toFixed(1),
duration: audioBuffer.duration.toFixed(1),
percentComplete: ((t / audioBuffer.duration) * 100).toFixed(1),
recorderState: mediaRecorder.state,
chunks: chunks.length,
chunkBytes,
});
}
// Guard: if audio should have ended but didn't, stop
if (t > audioBuffer.duration + 2) {
log('duration guard hit', { t, duration: audioBuffer.duration });
stopRecorder('duration_guard');
return;
}
const frameIndex = Math.floor(t * fps);
if (frameIndex !== lastRenderedFrame) {
const startSample = Math.min(frameIndex * samplesPerFrame, totalSamples - 1);
renderFrameAtSample(startSample);
lastRenderedFrame = frameIndex;
// Update progress less frequently for performance
if (frameIndex % 60 === 0) {
setProgress(Math.min(99, Math.floor((startSample / totalSamples) * 100)));
}
}
rafId = requestAnimationFrame(renderLoop);
};
rafId = requestAnimationFrame(renderLoop);
});
}, []);
const reset = useCallback(() => {
if (exportedUrl) {
URL.revokeObjectURL(exportedUrl);
}
cancelRef.current = true;
setExportedUrl(null);
setProgress(0);
}, [exportedUrl]);
return {
isExporting,
progress,
exportedUrl,
exportVideo,
reset,
};
}


@@ -111,6 +111,8 @@ const Index = () => {
setIsRedTheme(!isRedTheme);
playSound('click');
unlockAchievement('theme_switcher');
// Notify other components of theme change
window.dispatchEvent(new CustomEvent('themeChange', { detail: { isRedTheme: !isRedTheme } }));
};
const handleConsentClose = () => {


@@ -0,0 +1,17 @@
import { motion } from 'framer-motion';
import { Oscilloscope } from '@/components/Oscilloscope';
const OscilloscopePage = () => {
return (
<motion.div
initial={{ opacity: 0 }}
animate={{ opacity: 1 }}
transition={{ duration: 0.5 }}
className="space-y-6"
>
<Oscilloscope />
</motion.div>
);
};
export default OscilloscopePage;

videoExportTestApi.ts Executable file

@@ -0,0 +1,454 @@
/**
* Video Export Test API
*
* Exposes a global API for automated testing of video exports.
*
* Usage in browser console or automated tests:
*
* // Run export with a test audio file
* const result = await window.VideoExportTestAPI.runExport(audioFileBlob, {
* width: 1920,
* height: 1080,
* fps: 60,
* mode: 'combined'
* });
*
* // result = { success: boolean, url?: string, error?: string, stats: {...} }
*
* // Download the result
* window.VideoExportTestAPI.downloadBlob(result.url, 'test-output.webm');
*
* // Validate the blob (basic checks)
* const validation = await window.VideoExportTestAPI.validateBlob(result.url);
* // validation = { valid: boolean, size: number, type: string, issues: string[] }
*/
import type { OscilloscopeMode } from '../hooks/useOscilloscopeRenderer';
export interface TestExportOptions {
width?: number;
height?: number;
fps?: number;
mode?: OscilloscopeMode;
}
export interface TestExportResult {
success: boolean;
url?: string;
error?: string;
stats: {
duration: number;
blobSize: number;
mimeType: string;
exportTimeMs: number;
};
}
export interface ValidationResult {
valid: boolean;
size: number;
type: string;
issues: string[];
}
// Simple audio analyzer for test purposes
async function analyzeAudio(file: File): Promise<{
leftChannel: Float32Array;
rightChannel: Float32Array;
sampleRate: number;
}> {
const audioContext = new AudioContext();
const arrayBuffer = await file.arrayBuffer();
const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
const leftChannel = audioBuffer.getChannelData(0);
const rightChannel = audioBuffer.numberOfChannels > 1
? audioBuffer.getChannelData(1)
: leftChannel;
await audioContext.close();
return {
leftChannel,
rightChannel,
sampleRate: audioBuffer.sampleRate,
};
}
class VideoExportTestAPIClass {
async runExport(
audioFile: File | Blob,
options: TestExportOptions = {}
): Promise<TestExportResult> {
const startTime = performance.now();
const file = audioFile instanceof File
? audioFile
: new File([audioFile], 'test-audio.mp3', { type: audioFile.type });
const opts = {
width: options.width ?? 1920,
height: options.height ?? 1080,
fps: options.fps ?? 60,
mode: options.mode ?? 'combined' as OscilloscopeMode,
};
console.log('[VideoExportTestAPI] Starting export with options:', opts);
try {
// Analyze audio
const audioData = await analyzeAudio(file);
console.log('[VideoExportTestAPI] Audio analyzed:', {
sampleRate: audioData.sampleRate,
duration: audioData.leftChannel.length / audioData.sampleRate,
samples: audioData.leftChannel.length,
});
// Execute export
const url = await this.executeExport(audioData, file, opts);
const blob = await fetch(url).then(r => r.blob());
const exportTimeMs = performance.now() - startTime;
const result: TestExportResult = {
success: true,
url,
stats: {
duration: audioData.leftChannel.length / audioData.sampleRate,
blobSize: blob.size,
mimeType: blob.type,
exportTimeMs,
},
};
console.log('[VideoExportTestAPI] Export completed:', result);
return result;
} catch (error) {
const exportTimeMs = performance.now() - startTime;
const result: TestExportResult = {
success: false,
error: error instanceof Error ? error.message : String(error),
stats: {
duration: 0,
blobSize: 0,
mimeType: '',
exportTimeMs,
},
};
console.error('[VideoExportTestAPI] Export failed:', result);
return result;
}
}
private async executeExport(
audioData: { leftChannel: Float32Array; rightChannel: Float32Array; sampleRate: number },
audioFile: File,
options: { width: number; height: number; fps: number; mode: OscilloscopeMode }
): Promise<string> {
const { width, height, fps, mode } = options;
const totalSamples = audioData.leftChannel.length;
const samplesPerFrame = Math.floor(audioData.sampleRate / fps);
const log = (...args: unknown[]) => {
console.log('[VideoExportTestAPI]', ...args);
};
// Create canvas
const canvas = document.createElement('canvas');
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext('2d');
if (!ctx) throw new Error('Could not get 2D context');
const leftColor = '#00ff00';
const rightColor = '#00ccff';
const xyColor = '#ff8800';
const dividerColor = '#333333';
const renderFrame = (startSample: number) => {
ctx.fillStyle = 'black';
ctx.fillRect(0, 0, width, height);
ctx.lineWidth = 2;
const endSample = Math.min(startSample + samplesPerFrame, totalSamples);
if (mode === 'combined') {
ctx.strokeStyle = leftColor;
ctx.beginPath();
const samplesPerPixel = samplesPerFrame / width;
const centerY = height / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = (audioData.leftChannel[sampleIndex] + audioData.rightChannel[sampleIndex]) / 2;
const y = centerY - sample * (height * 0.4);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
} else if (mode === 'separate') {
const halfHeight = height / 2;
const samplesPerPixel = samplesPerFrame / width;
// Left (top)
ctx.strokeStyle = leftColor;
ctx.beginPath();
const leftCenterY = halfHeight / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (halfHeight * 0.35);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Right (bottom)
ctx.strokeStyle = rightColor;
ctx.beginPath();
const rightCenterY = halfHeight + halfHeight / 2;
for (let x = 0; x < width; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (halfHeight * 0.35);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Divider
ctx.strokeStyle = dividerColor;
ctx.beginPath();
ctx.moveTo(0, halfHeight);
ctx.lineTo(width, halfHeight);
ctx.stroke();
} else if (mode === 'all') {
const topHeight = height / 2;
const bottomHeight = height / 2;
const halfWidth = width / 2;
const samplesPerPixel = samplesPerFrame / halfWidth;
// Left (top-left)
ctx.strokeStyle = leftColor;
ctx.beginPath();
const leftCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.leftChannel[sampleIndex];
const y = leftCenterY - sample * (topHeight * 0.35);
if (x === 0) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Right (top-right)
ctx.strokeStyle = rightColor;
ctx.beginPath();
const rightCenterY = topHeight / 2;
for (let x = 0; x < halfWidth; x++) {
const sampleIndex = Math.floor(startSample + x * samplesPerPixel);
if (sampleIndex >= totalSamples) break;
const sample = audioData.rightChannel[sampleIndex];
const y = rightCenterY - sample * (topHeight * 0.35);
if (x === 0) ctx.moveTo(halfWidth + x, y);
else ctx.lineTo(halfWidth + x, y);
}
ctx.stroke();
// XY (bottom half)
ctx.strokeStyle = xyColor;
ctx.beginPath();
const xyCenterX = width / 2;
const xyCenterY = topHeight + bottomHeight / 2;
const xyScale = Math.min(halfWidth, bottomHeight) * 0.35;
for (let i = startSample; i < endSample; i++) {
const x = xyCenterX + audioData.leftChannel[i] * xyScale;
const y = xyCenterY - audioData.rightChannel[i] * xyScale;
if (i === startSample) ctx.moveTo(x, y);
else ctx.lineTo(x, y);
}
ctx.stroke();
// Dividers
ctx.strokeStyle = dividerColor;
ctx.beginPath();
ctx.moveTo(0, topHeight);
ctx.lineTo(width, topHeight);
ctx.stroke();
ctx.beginPath();
ctx.moveTo(halfWidth, 0);
ctx.lineTo(halfWidth, topHeight);
ctx.stroke();
}
};
// Setup recording
const videoStream = canvas.captureStream(fps);
const audioContext = new AudioContext();
await audioContext.resume();
const audioArrayBuffer = await audioFile.arrayBuffer();
const audioBuffer = await audioContext.decodeAudioData(audioArrayBuffer);
const audioSource = audioContext.createBufferSource();
audioSource.buffer = audioBuffer;
const audioDestination = audioContext.createMediaStreamDestination();
audioSource.connect(audioDestination);
const combinedStream = new MediaStream([
...videoStream.getVideoTracks(),
...audioDestination.stream.getAudioTracks(),
]);
let mimeType = 'video/webm;codecs=vp8,opus';
if (!MediaRecorder.isTypeSupported(mimeType)) {
mimeType = 'video/webm;codecs=vp9,opus';
}
if (!MediaRecorder.isTypeSupported(mimeType)) {
mimeType = 'video/webm';
}
const mediaRecorder = new MediaRecorder(combinedStream, {
mimeType,
videoBitsPerSecond: 8000000,
audioBitsPerSecond: 256000,
});
const chunks: Blob[] = [];
return new Promise<string>((resolve, reject) => {
let stopped = false;
const stopRecorder = (reason: string) => {
if (stopped) return;
stopped = true;
log('stopRecorder', reason);
if (mediaRecorder.state === 'recording') {
mediaRecorder.stop();
}
};
mediaRecorder.ondataavailable = (e) => {
log('ondataavailable', { size: e.data?.size, type: e.data?.type });
if (e.data && e.data.size > 0) {
chunks.push(e.data);
}
};
mediaRecorder.onstop = async () => {
log('onstop', { chunks: chunks.length });
await audioContext.close();
combinedStream.getTracks().forEach(t => t.stop());
const blob = new Blob(chunks, { type: mimeType });
log('final blob', { size: blob.size });
if (blob.size === 0) {
reject(new Error('Empty blob'));
return;
}
resolve(URL.createObjectURL(blob));
};
mediaRecorder.onerror = (e) => reject(e);
audioSource.onended = () => {
log('audioSource.onended');
renderFrame(Math.max(0, totalSamples - samplesPerFrame));
stopRecorder('audio_ended');
};
// Start recording
mediaRecorder.start();
const exportStart = audioContext.currentTime;
audioSource.start(0);
log('started', { duration: audioBuffer.duration });
// Safety timeout
setTimeout(() => stopRecorder('timeout'), (audioBuffer.duration + 30) * 1000);
// Render loop
let lastFrame = -1;
const loop = () => {
if (stopped) return;
const t = Math.max(0, audioContext.currentTime - exportStart);
const frameIndex = Math.floor(t * fps);
if (frameIndex !== lastFrame) {
renderFrame(Math.min(frameIndex * samplesPerFrame, totalSamples - 1));
lastFrame = frameIndex;
}
requestAnimationFrame(loop);
};
requestAnimationFrame(loop);
});
}
async validateBlob(url: string): Promise<ValidationResult> {
const issues: string[] = [];
try {
const response = await fetch(url);
const blob = await response.blob();
if (blob.size === 0) {
issues.push('Blob is empty');
}
if (!blob.type.includes('webm')) {
issues.push(`Unexpected MIME type: ${blob.type}`);
}
// Check WebM magic bytes
const header = await blob.slice(0, 4).arrayBuffer();
const bytes = new Uint8Array(header);
// WebM starts with 0x1A 0x45 0xDF 0xA3 (EBML header)
if (bytes[0] !== 0x1A || bytes[1] !== 0x45 || bytes[2] !== 0xDF || bytes[3] !== 0xA3) {
issues.push('Invalid WebM header (missing EBML magic bytes)');
}
return {
valid: issues.length === 0,
size: blob.size,
type: blob.type,
issues,
};
} catch (error) {
return {
valid: false,
size: 0,
type: '',
issues: [error instanceof Error ? error.message : String(error)],
};
}
}
downloadBlob(url: string, filename: string = 'test-export.webm') {
const a = document.createElement('a');
a.href = url;
a.download = filename;
a.click();
}
}
// Expose globally for testing
const api = new VideoExportTestAPIClass();
declare global {
interface Window {
VideoExportTestAPI: VideoExportTestAPIClass;
}
}
if (typeof window !== 'undefined') {
window.VideoExportTestAPI = api;
}
export const VideoExportTestAPI = api;
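
For completeness, here is a minimal end-to-end sketch of driving this test API from the browser console, combining the calls documented above. The fixture URL is a placeholder assumption, and the option values are only examples mirroring the defaults.

// End-to-end sketch (browser console). The fixture path below is an assumption;
// any audio File/Blob works as input to runExport().
const blob = await fetch('/fixtures/test-tone.mp3').then((r) => r.blob()); // hypothetical fixture
const result = await window.VideoExportTestAPI.runExport(blob, { width: 1280, height: 720, fps: 60, mode: 'all' });
if (result.success && result.url) {
  const validation = await window.VideoExportTestAPI.validateBlob(result.url);
  console.log('validation', validation); // { valid, size, type, issues }
  if (validation.valid) {
    window.VideoExportTestAPI.downloadBlob(result.url, 'oscilloscope-test.webm');
  }
} else {
  console.error('export failed', result.error, result.stats);
}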