Big2Be is an events-and-experiences platform. It needs a full YouTube-style video module so content creators can upload videos and shorts related to events, behind-the-scenes footage, promotions, and experiences. The module includes: upload, FFmpeg processing, adaptive HLS streaming, comments, likes, shares, view tracking, and AI content moderation via OpenRouter.
User decisions:
- Storage: AWS S3 (already configured)
- Moderation: OpenRouter (multi-model AI gateway with vision)
- Platform type: events and experiences (Big2Be focused)
- Streaming: HLS adaptive streaming (quality adapts to the connection)
```
[Upload Video] → [S3 raw/] → [Processing Job Queue]
                                      │
                          ┌───────────┴───────────┐
                          │   FFmpeg Pipeline     │
                          │                       │
                          │ 1. Extract metadata   │
                          │ 2. Transcode HLS      │
                          │    (360/480/720/1080) │
                          │ 3. Generate thumbnails│
                          │ 4. Extract frames     │
                          └───────────┬───────────┘
                                      │
                          ┌───────────┴───────────┐
                          │  Content Moderation   │
                          │  (OpenRouter Vision)  │
                          └───────────┬───────────┘
                                      │
                              ┌───────┴───────┐
                              │               │
                           [SAFE]         [FLAGGED]
                              │               │
                       Status: ready   Status: flagged
                        (Published)    (Admin review)
```
7 MongoDB models:

| Model | Collection | Purpose |
|---|---|---|
| Video | videos | Main video document with metadata, HLS URLs, denormalized counters |
| VideoComment | video_comments | Comments with thread support (parentId) |
| VideoView | video_views | Per-user view tracking (watch duration, completed) |
| VideoLike | video_likes | Unified likes/dislikes for videos AND comments |
| VideoShare | video_shares | Log of shares per platform |
| VideoProcessingJob | video_processing_jobs | Processing pipeline state |
| VideoReport | video_reports | User reports about content |
Video model (key fields):
```go
type Video struct {
	ID        primitive.ObjectID
	CreatorID primitive.ObjectID
	Title, Slug, Description string

	// HLS streaming
	RawVideoURL      string            // S3 raw/ - hidden from the API (json:"-")
	HLSMasterURL     string            // URL of the master .m3u8 playlist
	HLSVariants      map[string]string // {"360p": "url", "720p": "url", ...}
	ThumbnailURL     string
	ThumbnailOptions []string // 3 generated options
	PreviewGifURL    string   // preview GIF (hover)

	// Metadata
	Duration      float64 // seconds
	Width, Height int
	FileSize      int64

	// Classification - Big2Be focused
	Category string              // "event-highlight", "behind-scenes", "promo", "tutorial", "review", "vlog"
	Tags     []string
	EventID  *primitive.ObjectID // optional link to a Big2Be event
	IsShort  bool                // videos <= 60s

	// State
	Status     string // uploading, processing, ready, failed, flagged
	Visibility string // public, private, unlisted

	// Denormalized counters
	ViewCount, LikeCount, DislikeCount, CommentCount, ShareCount int64

	// Moderation
	ModerationScore float64
	IsFlagged       bool
	FlagReason      string
}
```

Big2Be-specific categories:
```go
const (
	VideoCategoryEventHighlight = "event-highlight" // event highlights
	VideoCategoryBehindScenes   = "behind-scenes"   // behind the scenes
	VideoCategoryPromo          = "promo"           // event promotions
	VideoCategoryTutorial       = "tutorial"        // tutorials/how-to
	VideoCategoryReview         = "review"          // event reviews
	VideoCategoryVlog           = "vlog"            // creator vlogs
	VideoCategoryShorts         = "shorts"          // short-form content
)
```

Add fields to the Config struct:
```go
// AWS S3
S3BucketName string
S3Region     string
S3AccessKey  string
S3SecretKey  string
S3Endpoint   string // for localstack/minio in dev

// Video processing
FFmpegPath             string // default: "ffmpeg"
MaxVideoSizeBytes      int64  // default: 500MB
MaxShortDurationSec    int    // default: 60
VideoProcessingWorkers int    // default: 2

// Content moderation
OpenRouterAPIKey    string
OpenRouterModel     string  // default: "google/gemini-flash-1.5"
ModerationEnabled   bool    // default: true
ModerationThreshold float64 // default: 0.7
```

Add video permissions:
```go
PermissionUploadVideo    = "upload_video"
PermissionEditOwnVideo   = "edit_own_video"
PermissionDeleteOwnVideo = "delete_own_video"
PermissionViewVideoStats = "view_video_stats"
PermissionModerateVideo  = "moderate_video"
```

Add them to RoleContentCreator and RoleAdmin.
Add a video section with all the variables.
Initialize the AWS S3 client as a global singleton (config.S3Client), following the pattern in config/firebase.go.
Abstraction over S3:
- Upload(ctx, objectPath, reader, contentType) error
- Download(ctx, objectPath) (io.ReadCloser, error)
- GetPresignedURL(ctx, objectPath, expiry) (string, error) - signed URLs for secure streaming
- Delete(ctx, objectPath) error
- GenerateUploadURL(ctx, objectPath, contentType, expiry) (string, error) - direct upload from the client
S3 bucket layout:

```
big2be-videos/
  raw/{videoID}/original.{ext}        # original video
  hls/{videoID}/master.m3u8           # HLS master playlist
  hls/{videoID}/360p/playlist.m3u8    # HLS 360p variant
  hls/{videoID}/360p/segment_000.ts   # segments
  hls/{videoID}/480p/...
  hls/{videoID}/720p/...
  hls/{videoID}/1080p/...
  thumbnails/{videoID}/thumb_0.jpg    # 3 thumbnails
  thumbnails/{videoID}/thumb_1.jpg
  thumbnails/{videoID}/thumb_2.jpg
  thumbnails/{videoID}/preview.gif    # animated preview
  frames/{videoID}/frame_{N}.jpg      # frames for moderation
```
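The layout above can be centralized in small key-builder helpers so object paths are never hand-assembled in two places. A minimal sketch — the helper names (rawKey, hlsSegmentKey, thumbnailKey) are illustrative, not existing code:

```go
package main

import "fmt"

// rawKey builds the S3 key for the original upload.
func rawKey(videoID, ext string) string {
	return fmt.Sprintf("raw/%s/original.%s", videoID, ext)
}

// hlsSegmentKey builds the key for one HLS segment of a given rendition.
func hlsSegmentKey(videoID, quality string, n int) string {
	return fmt.Sprintf("hls/%s/%s/segment_%03d.ts", videoID, quality, n)
}

// thumbnailKey builds the key for one of the three generated thumbnails.
func thumbnailKey(videoID string, n int) string {
	return fmt.Sprintf("thumbnails/%s/thumb_%d.jpg", videoID, n)
}

func main() {
	fmt.Println(rawKey("abc123", "mp4"))            // raw/abc123/original.mp4
	fmt.Println(hlsSegmentKey("abc123", "720p", 7)) // hls/abc123/720p/segment_007.ts
	fmt.Println(thumbnailKey("abc123", 0))          // thumbnails/abc123/thumb_0.jpg
}
```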
Interface IVideoRepository with methods for:
- Video CRUD (create, get by ID, get by slug, update, soft delete, list with filters)
- IncrementCounter (atomic $inc for viewCount, likeCount, etc.)
- Comments (CRUD, paginated listing with threads)
- Likes (upsert with previous-like detection for toggling)
- Views (create, get by user+video, history)
- Shares (create)
- Processing jobs (create, claim, update step, complete, fail)
- Reports (create, list pending)
- Feed and trending (aggregation pipelines)
Interface IVideoService with high-level methods.
Implementation backed by 7 MongoDB collections:
```go
type VideoRepository struct {
	*BaseRepository                      // "videos"
	commentsCollection *mongo.Collection // "video_comments"
	likesCollection    *mongo.Collection // "video_likes"
	viewsCollection    *mongo.Collection // "video_views"
	sharesCollection   *mongo.Collection // "video_shares"
	jobsCollection     *mongo.Collection // "video_processing_jobs"
	reportsCollection  *mongo.Collection // "video_reports"
}
```

Critical indexes (add in config/database.go):
videos: {creatorId, createdAt}, {status, visibility, createdAt}, {slug} UNIQUE, {tags}, {category}, {eventId}, TEXT(title, description, tags)
video_likes: {userId, targetId, targetType} UNIQUE
video_views: {videoId, userId}, {userId, createdAt}
video_comments: {videoId, parentId, createdAt}
video_processing_jobs: {status, createdAt}, {videoId}
Trending pipeline (aggregation):

```
score = views*1 + likes*3 + comments*5 + shares*7
filter: status=ready, visibility=public, createdAt >= 7 days ago
sort by score DESC
```
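The scoring step can be sketched in plain Go to make the weighting concrete (in practice the $match and $sort run inside the MongoDB aggregation; VideoStats and rankTrending are illustrative names):

```go
package main

import (
	"fmt"
	"sort"
)

// VideoStats holds the denormalized counters the trending score reads.
type VideoStats struct {
	ID                             string
	Views, Likes, Comments, Shares int64
}

// trendingScore applies the pipeline's weighting:
// views*1 + likes*3 + comments*5 + shares*7.
func trendingScore(v VideoStats) int64 {
	return v.Views*1 + v.Likes*3 + v.Comments*5 + v.Shares*7
}

// rankTrending sorts videos by score DESC.
func rankTrending(vs []VideoStats) {
	sort.Slice(vs, func(i, j int) bool {
		return trendingScore(vs[i]) > trendingScore(vs[j])
	})
}

func main() {
	vs := []VideoStats{
		{ID: "a", Views: 100},            // score 100
		{ID: "b", Views: 10, Shares: 20}, // score 10 + 140 = 150
	}
	rankTrending(vs)
	fmt.Println(vs[0].ID) // b
}
```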
Main service with all the business logic.

Upload flow:
- Validate the file (size; allowed types: mp4/mov/avi/webm/mkv)
- Generate the videoID and a unique slug
- Upload the raw video to S3 at raw/{videoID}/original.{ext}
- Create the Video document with status: "uploading"
- Create a VideoProcessingJob with status: "pending"
- Update the video to status: "processing"
- Return the video (the client polls its status)
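The unique-slug step of the upload flow might look like the sketch below. slugify is a minimal stand-in for github.com/gosimple/slug (which the plan pulls in as a dependency), and `taken` stands in for the repository's get-by-slug lookup:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var nonSlug = regexp.MustCompile(`[^a-z0-9]+`)

// slugify lowercases the title and collapses runs of non-alphanumeric
// characters into single dashes (simplified vs. gosimple/slug).
func slugify(title string) string {
	s := nonSlug.ReplaceAllString(strings.ToLower(title), "-")
	return strings.Trim(s, "-")
}

// uniqueSlug appends a numeric suffix until the slug is unused.
func uniqueSlug(title string, taken map[string]bool) string {
	base := slugify(title)
	s := base
	for i := 2; taken[s]; i++ {
		s = fmt.Sprintf("%s-%d", base, i)
	}
	return s
}

func main() {
	taken := map[string]bool{"summer-fest-recap": true}
	fmt.Println(uniqueSlug("Summer Fest Recap!", taken)) // summer-fest-recap-2
}
```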
View tracking:
- Check whether the user already viewed the video within the last 30 minutes
- If not, create a VideoView and increment viewCount with $inc
Like/dislike (atomic toggle):
- Upsert the like in video_likes
- If a previous like of the same type exists → do nothing (idempotent)
- If a previous like of a different type exists → decrement the old counter, increment the new one
- If no previous like exists → increment the corresponding counter
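The toggle's counter arithmetic can be isolated as a pure function that returns the $inc deltas to apply (likeDeltas is an illustrative name, not existing code):

```go
package main

import "fmt"

// likeDeltas returns the (likeCount, dislikeCount) deltas for a toggle,
// where prev and next are "", "like", or "dislike".
func likeDeltas(prev, next string) (dLike, dDislike int64) {
	if prev == next {
		return 0, 0 // idempotent: same reaction repeated
	}
	switch prev {
	case "like":
		dLike-- // remove the old like
	case "dislike":
		dDislike-- // remove the old dislike
	}
	switch next {
	case "like":
		dLike++
	case "dislike":
		dDislike++
	}
	return
}

func main() {
	fmt.Println(likeDeltas("", "like"))        // 1 0
	fmt.Println(likeDeltas("like", "like"))    // 0 0
	fmt.Println(likeDeltas("like", "dislike")) // -1 1
}
```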
Personalized feed:
- Videos from creators the user follows
- Videos linked to events the user bought tickets for
- Trending videos in categories of interest
- Mix and rank by relevance + recency
FFmpeg processing pipeline:

Step 1 - Metadata:

```
ffprobe -v quiet -print_format json -show_format -show_streams input.mp4
```

Extract: duration, width, height, codec, bitrate.
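Parsing the ffprobe JSON in Go might look like the sketch below; the struct covers only the fields the pipeline needs, and note that ffprobe reports `duration` as a string inside `format`:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strconv"
)

// ffprobeOutput models the relevant parts of `ffprobe -print_format json`.
type ffprobeOutput struct {
	Format struct {
		Duration string `json:"duration"` // ffprobe emits this as a string
	} `json:"format"`
	Streams []struct {
		CodecType string `json:"codec_type"`
		CodecName string `json:"codec_name"`
		Width     int    `json:"width"`
		Height    int    `json:"height"`
	} `json:"streams"`
}

// parseProbe extracts duration and the first video stream's dimensions.
func parseProbe(raw []byte) (dur float64, w, h int, err error) {
	var out ffprobeOutput
	if err = json.Unmarshal(raw, &out); err != nil {
		return
	}
	dur, err = strconv.ParseFloat(out.Format.Duration, 64)
	for _, s := range out.Streams {
		if s.CodecType == "video" {
			w, h = s.Width, s.Height
			break
		}
	}
	return
}

func main() {
	sample := []byte(`{"format":{"duration":"12.500000"},"streams":[{"codec_type":"video","codec_name":"h264","width":1920,"height":1080}]}`)
	d, w, h, _ := parseProbe(sample)
	fmt.Println(d, w, h) // 12.5 1920 1080
}
```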
Paso 2 - HLS Transcoding (adaptativo):
# Para cada calidad (solo si source >= calidad)
ffmpeg -i input.mp4 \
-c:v libx264 -preset medium -crf 23 \
-c:a aac -b:a 128k \
-vf scale=-2:{height} \
-hls_time 6 \
-hls_playlist_type vod \
-hls_segment_filename '{videoID}/{quality}/segment_%03d.ts' \
'{videoID}/{quality}/playlist.m3u8'Calidades con bitrates:
| Quality | Resolution | Video Bitrate | Audio |
|---|---|---|---|
| 360p | -2:360 | 800k | 96k |
| 480p | -2:480 | 1400k | 128k |
| 720p | -2:720 | 2800k | 128k |
| 1080p | -2:1080 | 5000k | 192k |
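The "only if source >= quality" rule maps to a small ladder filter; the rendition table below mirrors the bitrates above (type and function names are illustrative):

```go
package main

import "fmt"

// rendition pairs a ladder entry with its target bitrates from the table.
type rendition struct {
	Name         string
	Height       int
	VideoBitrate string
	AudioBitrate string
}

var ladder = []rendition{
	{"360p", 360, "800k", "96k"},
	{"480p", 480, "1400k", "128k"},
	{"720p", 720, "2800k", "128k"},
	{"1080p", 1080, "5000k", "192k"},
}

// renditionsFor keeps only ladder entries the source can feed, so a
// 720p upload is never upscaled to 1080p.
func renditionsFor(sourceHeight int) []rendition {
	var out []rendition
	for _, r := range ladder {
		if sourceHeight >= r.Height {
			out = append(out, r)
		}
	}
	return out
}

func main() {
	for _, r := range renditionsFor(720) {
		fmt.Println(r.Name) // 360p, 480p, 720p
	}
}
```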
Step 3 - Master playlist:

Generate a master.m3u8 that references every variant:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=896000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1528000,RESOLUTION=854x480
480p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2928000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5192000,RESOLUTION=1920x1080
1080p/playlist.m3u8
```

Step 4 - Thumbnails:
```
# 3 thumbnails at 10%, 50%, and 90% of the duration
ffmpeg -i input.mp4 -ss {time} -vframes 1 -q:v 2 thumb_{N}.jpg
```

Step 5 - Preview GIF:
```
ffmpeg -i input.mp4 -ss 0 -t 3 -vf "fps=10,scale=320:-1" -loop 0 preview.gif
```

Step 6 - Frame extraction:
```
# 1 frame every 5 seconds, for moderation
ffmpeg -i input.mp4 -vf "fps=1/5" frame_%04d.jpg
```

Step 7 - Upload outputs to S3
Step 8 - Content moderation (frames)
Step 9 - Update the Video with URLs and status
Step 10 - Clean up the temp directory
OpenRouter integration to analyze frames:

```go
type ContentModerationService struct {
	apiKey    string
	model     string // "google/gemini-flash-1.5" (fast and cheap for vision)
	enabled   bool
	threshold float64
}
```

Flow:
- For each extracted frame, send it to OpenRouter with the moderation prompt
- Prompt: "Analyze this image for inappropriate content. Rate: adult (0-1), violence (0-1), hate (0-1), drugs (0-1). Return JSON."
- If any category > threshold → flag the video
- Optimization: analyze frames in batches of 5 and stop at the first flag
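The threshold check over the model's JSON verdict can be sketched as follows (ModerationScores mirrors the prompt's four categories; evaluate is a hypothetical helper — the worst score above the threshold wins):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ModerationScores mirrors the JSON the moderation prompt asks for.
type ModerationScores struct {
	Adult    float64 `json:"adult"`
	Violence float64 `json:"violence"`
	Hate     float64 `json:"hate"`
	Drugs    float64 `json:"drugs"`
}

// evaluate flags the frame if any category exceeds the threshold and
// returns the offending category (highest score wins).
func evaluate(s ModerationScores, threshold float64) (flagged bool, reason string) {
	worst := threshold
	cats := map[string]float64{"adult": s.Adult, "violence": s.Violence, "hate": s.Hate, "drugs": s.Drugs}
	for cat, v := range cats {
		if v > worst {
			worst, flagged, reason = v, true, cat
		}
	}
	return
}

func main() {
	var s ModerationScores
	_ = json.Unmarshal([]byte(`{"adult":0.1,"violence":0.9,"hate":0.0,"drugs":0.2}`), &s)
	flagged, reason := evaluate(s, 0.7)
	fmt.Println(flagged, reason) // true violence
}
```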
OpenRouter request:

```
POST https://openrouter.ai/api/v1/chat/completions
{
  "model": "google/gemini-flash-1.5",
  "messages": [{
    "role": "user",
    "content": [
      {"type": "text", "text": "Analyze this image..."},
      {"type": "image_url", "image_url": {"url": "data:image/jpeg;base64,..."}}
    ]
  }]
}
```

Request/Response DTOs + handlers following the event_controller.go pattern:
```go
type VideoController struct {
	videoService *services.VideoService
}
```

IMPORTANT: register the fixed routes (/feed, /trending, /shorts) BEFORE /:id to avoid route collisions in Fiber.
```
=== PUBLIC ROUTES ===
GET    /api/v1/videos                          → ListVideos (paginated, filters)
GET    /api/v1/videos/trending                 → GetTrending
GET    /api/v1/videos/shorts                   → ListShorts
GET    /api/v1/videos/category/:category       → ListByCategory
GET    /api/v1/videos/event/:eventId           → ListByEvent
GET    /api/v1/videos/:id                      → GetVideo (OptionalAuth, tracks the view)
GET    /api/v1/videos/:id/comments             → GetComments (paginated, threads)
GET    /api/v1/videos/:id/share-link           → GetShareLink

=== PROTECTED ROUTES (auth required) ===
POST   /api/v1/videos                          → UploadVideo (RequirePermission: upload_video)
PUT    /api/v1/videos/:id                      → UpdateVideo (owner or admin)
DELETE /api/v1/videos/:id                      → DeleteVideo (owner or admin)
POST   /api/v1/videos/:id/like                 → LikeVideo
DELETE /api/v1/videos/:id/like                 → RemoveLike
POST   /api/v1/videos/:id/share                → RegisterShare
POST   /api/v1/videos/:id/report               → ReportVideo
POST   /api/v1/videos/:id/comments             → AddComment
PUT    /api/v1/videos/comments/:commentId      → EditComment
DELETE /api/v1/videos/comments/:commentId      → DeleteComment
POST   /api/v1/videos/comments/:commentId/like → LikeComment
GET    /api/v1/videos/feed                     → GetFeed (personalized)
GET    /api/v1/users/me/watch-history          → GetWatchHistory
POST   /api/v1/videos/:id/thumbnail            → SelectThumbnail (owner)

=== ADMIN ROUTES ===
GET    /api/v1/admin/videos/all                → AdminListAllVideos
GET    /api/v1/admin/videos/flagged            → AdminGetFlaggedVideos
GET    /api/v1/admin/videos/reports            → AdminGetReports
PUT    /api/v1/admin/videos/:id/moderate       → AdminModerateVideo (approve/reject)
GET    /api/v1/admin/videos/processing-jobs    → AdminGetProcessingJobs
GET    /api/v1/admin/videos/stats              → AdminGetVideoStats
```
Following the jobs/blockchain_registration_job.go pattern:

```go
type VideoProcessingJobRunner struct {
	processingService *services.VideoProcessingService
	cron              *cron.Cron
	workerCount       int
	mu                sync.Mutex
	running           bool
}
```

- Poll every 10 seconds for pending jobs
- Cap concurrent workers (configurable, default: 2)
- Claim jobs atomically (findOneAndUpdate, status: pending → processing)
- Retry automatically up to maxRetries (3)
- Panic recovery per worker
- Detailed logging per step
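The runner's bounded concurrency plus per-worker panic recovery can be sketched with channels. This is a simplified stand-in: real claiming happens in MongoDB with findOneAndUpdate, and a panicked job would be retried up to maxRetries rather than just logged:

```go
package main

import (
	"fmt"
	"sync"
)

// runWorkers drains a queue of job IDs with at most `workers` goroutines,
// recovering from panics so one bad job cannot kill the runner.
// It returns the IDs that completed successfully.
func runWorkers(jobs []string, workers int, process func(string)) []string {
	ch := make(chan string)
	var mu sync.Mutex
	var done []string
	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for id := range ch {
				func() {
					defer func() {
						if r := recover(); r != nil {
							fmt.Println("job panicked:", id, r)
							return // would schedule a retry in the real runner
						}
						mu.Lock()
						done = append(done, id)
						mu.Unlock()
					}()
					process(id)
				}()
			}
		}()
	}
	for _, id := range jobs {
		ch <- id
	}
	close(ch)
	wg.Wait()
	return done
}

func main() {
	done := runWorkers([]string{"j1", "j2", "j3"}, 2, func(id string) {
		if id == "j2" {
			panic("boom") // simulated processing failure
		}
	})
	fmt.Println(len(done)) // 2
}
```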
Add:

```go
// Fields
videoRepo    iface_repos.IVideoRepository
videoService iface_services.IVideoService

// initializeRepositories()
c.videoRepo = repositories.NewVideoRepository()

// initializeDependentServices()
c.videoService = services.NewVideoService()

// Getters + Reset
```

```go
// After InitFirebase():
config.InitS3()

// After the blockchain job:
videoProcessingJob := jobs.NewVideoProcessingJobRunner(config.DB)
videoProcessingJob.Start()

// In routes:
routes.SetupVideoRoutes(app)

// Fiber config - raise the body limit for video uploads:
app := fiber.New(fiber.Config{
	BodyLimit: int(config.AppConfig.MaxVideoSizeBytes), // 500MB
	// ... existing config
})
```

Add all the video indexes in createIndexes().
| # | File | Action | Depends on |
|---|---|---|---|
| 1 | models/video.go | CREATE | - |
| 2 | models/role.go | MODIFY | - |
| 3 | config/config.go | MODIFY | - |
| 4 | .env.example | MODIFY | - |
| 5 | config/s3.go | CREATE | #3 |
| 6 | services/storage_service.go | CREATE | #5 |
| 7 | services/content_moderation_service.go | CREATE | #3 |
| 8 | interfaces/repositories/video_repository.go | CREATE | #1 |
| 9 | interfaces/services/video_service.go | CREATE | #1 |
| 10 | repositories/video_repository.go | CREATE | #1, #8 |
| 11 | config/database.go | MODIFY | #10 |
| 12 | services/video_service.go | CREATE | #6, #10 |
| 13 | services/video_processing_service.go | CREATE | #6, #7, #10 |
| 14 | controllers/video_controller.go | CREATE | #12 |
| 15 | routes/video.go | CREATE | #14 |
| 16 | jobs/video_processing_job.go | CREATE | #13 |
| 17 | container/container.go | MODIFY | #8, #9, #10, #12 |
| 18 | cmd/api/main.go | MODIFY | #5, #15, #16, #17 |
```
# AWS SDK for S3
go get github.com/aws/aws-sdk-go-v2
go get github.com/aws/aws-sdk-go-v2/config
go get github.com/aws/aws-sdk-go-v2/service/s3
go get github.com/aws/aws-sdk-go-v2/credentials

# Slug generation
go get github.com/gosimple/slug
```

Server requirements:
- FFmpeg and FFprobe installed (apt-get install -y ffmpeg in Docker)
- ~3GB of temp space per worker for video processing
```
go build ./...
go vet ./...
```

- Upload a video via POST /api/v1/videos (multipart with an .mp4 file)
- Verify the processing job is created and executed
- Verify the HLS master playlist is generated in S3
- Test streaming with an HLS player (video.js, hls.js)
- Verify frames are extracted and moderated
- Test comments, likes, and shares
- Verify feed and trending
```
# Upload
curl -X POST /api/v1/videos -F "video=@test.mp4" -F "title=Test" -F "category=event-highlight"

# Get video with HLS URLs
curl /api/v1/videos/{id}

# Trending
curl /api/v1/videos/trending

# Comments
curl -X POST /api/v1/videos/{id}/comments -d '{"content":"Great video!"}'
```

Files to create:
- models/video.go
- config/s3.go
- services/storage_service.go
- services/content_moderation_service.go
- interfaces/repositories/video_repository.go
- interfaces/services/video_service.go
- repositories/video_repository.go
- services/video_service.go
- services/video_processing_service.go
- controllers/video_controller.go
- routes/video.go
- jobs/video_processing_job.go

Files to modify:
- config/config.go - add S3, FFmpeg, moderation config
- models/role.go - add video permissions
- config/database.go - add video indexes
- container/container.go - register video repo/service
- cmd/api/main.go - init S3, routes, processing job, body limit
- .env.example - add video variables