

Java IVideoPicture Class Code Examples

This article collects typical usage examples of the Java class com.xuggle.xuggler.IVideoPicture. If you are wondering what the IVideoPicture class does, how to use it, or where to find examples of it in use, the hand-picked code samples below may help.


The IVideoPicture class belongs to the com.xuggle.xuggler package. A total of 15 code examples of the class are shown below, sorted by popularity by default.
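
Before turning to the individual examples, the following minimal sketch shows the most common way an IVideoPicture is produced and consumed: a BufferedImage is converted to a picture and handed to an IMediaWriter. The class name IVideoPictureSketch, the output path out.mp4, the 640x480 frame size, the 25 fps rate, and the MPEG-4 codec are illustrative assumptions, not values taken from any example below.

import java.awt.image.BufferedImage;

import com.xuggle.mediatool.IMediaWriter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IPixelFormat;
import com.xuggle.xuggler.IVideoPicture;
import com.xuggle.xuggler.video.ConverterFactory;
import com.xuggle.xuggler.video.IConverter;

public class IVideoPictureSketch {
	public static void main(String[] args) {
		final int width = 640, height = 480;        // assumed frame size
		final long frameIntervalUs = 1000000L / 25; // assumed 25 fps; timestamps are in microseconds

		IMediaWriter writer = ToolFactory.makeWriter("out.mp4"); // hypothetical output path
		writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MPEG4, width, height);

		// the converter turns a TYPE_3BYTE_BGR BufferedImage into an IVideoPicture
		BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
		IConverter converter = ConverterFactory.createConverter(image, IPixelFormat.Type.YUV420P);

		for (int i = 0; i < 100; i++) {
			// ... draw the current frame into `image` here ...
			IVideoPicture picture = converter.toPicture(image, i * frameIntervalUs);
			writer.encodeVideo(0, picture);
		}
		writer.close();
	}
}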

Example 1: decodeFrame

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
@Override
public boolean decodeFrame() {
	try {
		IVideoPicture picture = null;
		if(!(pictures.tryAcquire(1000, TimeUnit.MILLISECONDS))) {
			if(decoderThread.isAlive())
				return false;
			rewind();
			if(numPlays > 0)
				decodeFrame();
			return numPlays <= 0;
		}
		picture     = getPictureFromQ();
		playOutTime = baseTime + (picture.getTimeStamp() / IScheduler.SEC2US);
		isKeyframe  = picture.isKeyFrame();
		this.currentPicture.set(picture);
		return true;
	} catch (Throwable t) {
		return false;
	}
}
 
Developer: arisona, Project: ether, Lines: 22, Source: XuggleAccess.java

Example 2: onVideoPicture

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
@Override
public void onVideoPicture(IVideoPictureEvent event) {
	// < 0 means the file we are rolling off of has < RECORD_LENGTH
	// seconds of footage
	if (event.getTimeStamp() >= startingTimestamp || startingTimestamp < 0) {
		final IVideoPicture picture = event.getPicture();

		if (startTimestamp == -1) {
			startTimestamp = picture.getTimeStamp();
		}

		lastTimestamp = picture.getTimeStamp() - startTimestamp;
		picture.setTimeStamp(lastTimestamp);

		writer.encodeVideo(0, picture);
	}
}
 
Developer: phrack, Project: ShootOFF, Lines: 18, Source: RollingRecorder.java
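
For context, the onVideoPicture callback in Example 2 is driven by the mediatool pipeline. The sketch below shows one typical way such a listener is attached to an IMediaReader; the class name ListenerSketch and the input path input.mp4 are assumptions for illustration.

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;

public class ListenerSketch {
	public static void main(String[] args) {
		IMediaReader reader = ToolFactory.makeReader("input.mp4"); // hypothetical input path
		reader.addListener(new MediaListenerAdapter() {
			@Override
			public void onVideoPicture(IVideoPictureEvent event) {
				// every decoded frame arrives here as an IVideoPicture
				System.out.println("frame at " + event.getPicture().getTimeStamp() + " us");
			}
		});
		// readPacket() decodes one packet and fires the listener callbacks;
		// it returns null until the end of the stream or an error occurs
		while (reader.readPacket() == null) {
			// work happens in the callbacks
		}
	}
}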

Example 3: addImage

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
/**
 * Adds an image as a frame to the current video
 * @param image
 */
private void addImage(BufferedImage image){
	IPacket packet = IPacket.make();
	IConverter converter = ConverterFactory.createConverter(image, coder.getPixelType());
	IVideoPicture frame = converter.toPicture(image, Math.round(frameTime));
	
	if (coder.encodeVideo(packet, frame, 0) < 0) {
		throw new RuntimeException("Unable to encode video.");
	}
	
	if (packet.isComplete()) {
		if (writer.writePacket(packet) < 0) {
			throw new RuntimeException("Could not write packet to container.");
		}
	}
	this.frameTime += 1000000f / frameRate;
	frameCount++;
}
 
Developer: sensorstorm, Project: StormCV, Lines: 22, Source: StreamWriter.java

Example 4: encode

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
public void encode( AVPacket packet ) {
    IPacket ipacket = (IPacket)packet.getPacket();
    int streamNum = ipacket.getStreamIndex();
    ipacket = IPacket.make();
    printPacketInfo(ipacket);
    IVideoPicture picture = (IVideoPicture)packet.getDecodedObject();
    if( picture == null ) {
        LOG.debug("Picture is null" );
        return;
    } else
    {
        printPictureInfo(picture);
    }
    if( streamNum < outStreams.length ) {
        LOG.debug( "Coder Bitrate: " + outStreams[streamNum].getStreamCoder().getBitRate());
        LOG.debug( "Coder Name: " + outStreams[streamNum].getStreamCoder().getCodec().getName());
        
        int retVal = outStreams[streamNum].getStreamCoder().encodeVideo( ipacket, picture, -1);
        LOG.debug("encoding, retval = " + retVal );
        if( ipacket.isComplete() )
        {
            //packet = new XugglerPacket( ipacket, ICodec.Type.CODEC_TYPE_VIDEO );
            packet.setPacket( ipacket );
        }
    }
}
 
Developer: openpreserve, Project: video-batch, Lines: 27, Source: XugglerOutputStream.java

Example 5: encodeImage

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
/**
  * Encodes an image and writes it to the output stream.
  * 
  * @param image the image to encode (must be BufferedImage.TYPE_3BYTE_BGR)
  * @param pixelType the pixel type
  * @param timeStamp the time stamp in microseconds
  * @throws IOException
  */
private boolean encodeImage(BufferedImage image, IPixelFormat.Type pixelType, long timeStamp) 
		throws IOException {
	// convert image to xuggle picture
	IVideoPicture picture = getPicture(image, pixelType, timeStamp);
	if (picture==null)
		throw new RuntimeException("could not convert to picture"); //$NON-NLS-1$
	// make a packet
	IPacket packet = IPacket.make();
	if (outStreamCoder.encodeVideo(packet, picture, 0) < 0) {
		throw new RuntimeException("could not encode video"); //$NON-NLS-1$
	}
	if (packet.isComplete()) {
		boolean forceInterleave = true;
		if (outContainer.writePacket(packet, forceInterleave) < 0) {
			throw new RuntimeException("could not save packet to container"); //$NON-NLS-1$
		}
		return true;
	}
	return false;		
}
 
Developer: OpenSourcePhysics, Project: video-engines, Lines: 29, Source: XuggleVideoRecorder.java
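
Example 5 assumes that outContainer and outStreamCoder have already been opened for writing. A rough sketch of that setup with the lower-level Xuggler API might look as follows; the class name WriterSetupSketch, the output path, the codec choice, the frame size, and the time base are assumptions, not values taken from XuggleVideoRecorder.

import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IPixelFormat;
import com.xuggle.xuggler.IRational;
import com.xuggle.xuggler.IStreamCoder;

public class WriterSetupSketch {
	public static void main(String[] args) {
		IContainer outContainer = IContainer.make();
		if (outContainer.open("out.mp4", IContainer.Type.WRITE, null) < 0) // hypothetical output path
			throw new RuntimeException("could not open output container");

		ICodec codec = ICodec.findEncodingCodec(ICodec.ID.CODEC_ID_MPEG4); // assumed codec
		IStreamCoder outStreamCoder = outContainer.addNewStream(codec).getStreamCoder();
		outStreamCoder.setWidth(640);                                      // assumed frame size
		outStreamCoder.setHeight(480);
		outStreamCoder.setPixelType(IPixelFormat.Type.YUV420P);
		outStreamCoder.setTimeBase(IRational.make(1, 25));                 // assumed 25 fps time base
		if (outStreamCoder.open(null, null) < 0)
			throw new RuntimeException("could not open stream coder");

		outContainer.writeHeader();
		// ... convert and encode frames here, e.g. via a method like encodeImage(...) above ...
		outContainer.writeTrailer();
		outContainer.close();
	}
}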

Example 6: encode

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
@Override
public void encode(BufferedImage image) throws Exception {
    IVideoPicture frame = converter.toPicture(image, (long) (frameNo * deltat));
    frame.setQuality(0);

    movieWriter.encodeVideo(0, frame);
    frameNo++;
}
 
Developer: Helioviewer-Project, Project: JHelioviewer-SWHV, Lines: 9, Source: XuggleExporter.java

Example 7: getPictureFromQ

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
private IVideoPicture getPictureFromQ() {
	IVideoPicture result;
	synchronized (pictureQueue) {
		result = pictureQueue.firstValue();
		if(result == null) return null;
		pictureQueue.remove(result.getTimeStamp());
		queueSize.release();
	}
	return result;
}
 
Developer: arisona, Project: ether, Lines: 11, Source: XuggleAccess.java

Example 8: getHostImage

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
@Override
public IHostImage getHostImage(BlockingQueue<float[]> audioData) {
	IHostImage result = null;
	try {
		final int w = getWidth();
		final int h = getHeight();
		IVideoPicture newPic = currentPicture.get();
		if (resampler != null) {
			if(tmpPicture == null)
				tmpPicture = IVideoPicture.make(resampler.getOutputPixelFormat(), w, h); 
			newPic = tmpPicture;
			if (resampler.resample(newPic, currentPicture.get()) < 0) {
				log.warning("could not resample video");
				return null;
			}
		}
		if (newPic.getPixelType() != IPixelFormat.Type.RGB24) {
			log.warning("could not decode video as RGB24 bit data");
			return null;
		}
		ByteBuffer dstBuffer = BufferUtils.createByteBuffer(w * h * 3);
		flip(newPic.getByteBuffer(), dstBuffer, w, h);
		result = IHostImage.create(w, h, ComponentType.BYTE, ComponentFormat.RGB, dstBuffer);
		if(!(this.audioData.isEmpty())) {
			while(audioData.size() > (2  * this.audioData.size()) + 128)
				audioData.take();

			while(!(this.audioData.isEmpty())) 
				audioData.add(this.audioData.take());
		}
	} catch(Throwable t) {
		log.warning(t);
	}
	return result;
}
 
Developer: arisona, Project: ether, Lines: 36, Source: XuggleAccess.java

Example 9: run

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
public void run() {
	while (doStream) {
		IVideoPicture frame = frames.poll();
		if (frame != null) {
			// mark as keyframe
			//frame.setKeyFrame(i % keyFrameInterval == 0);
			//frame.setQuality(0);
			IPacket packet = IPacket.make();
			try {
				coder.encodeVideo(packet, frame, 0);
				if (packet.isComplete()) {
					container.writePacket(packet);
				}	
			} finally {
				frame.delete();
				if (packet != null) {
					packet.delete();
				}
			}
		} else {
			try {
				Thread.sleep(16L);
			} catch (InterruptedException e) {
			}
		}
	}
	System.out.println("Frame worker exit");
}
 
Developer: BigMarker, Project: deskshare-public, Lines: 29, Source: ScreenCap.java

Example 10: recordFrame

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
public void recordFrame(BufferedImage frame) {
	final BufferedImage image = ConverterFactory.convertToType(frame, BufferedImage.TYPE_3BYTE_BGR);
	final IConverter converter = ConverterFactory.createConverter(image, IPixelFormat.Type.YUV420P);

	timestamp = (System.currentTimeMillis() - startTime) + timeOffset;

	final IVideoPicture f = converter.toPicture(image, timestamp * 1000);
	f.setKeyFrame(isFirstShotFrame);
	f.setQuality(0);

	if (forking) {
		synchronized (bufferedFrames) {
			bufferedFrames.add(f);
		}
	} else {
		isFirstShotFrame = false;

		synchronized (videoWriterLock) {
			if (recording)
				videoWriter.encodeVideo(0, f);
		}

		if (timestamp >= ShotRecorder.RECORD_LENGTH * 3) {
			logger.debug("Rolling video file {}, timestamp = {} ms", relativeVideoFile.getPath(), timestamp);
			fork(false);
		}
	}
}
 
Developer: phrack, Project: ShootOFF, Lines: 29, Source: RollingRecorder.java

Example 11: recordFrame

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
public void recordFrame(BufferedImage frame) {
	final BufferedImage image = ConverterFactory.convertToType(frame, BufferedImage.TYPE_3BYTE_BGR);
	final IConverter converter = ConverterFactory.createConverter(image, IPixelFormat.Type.YUV420P);

	final long timestamp = (System.currentTimeMillis() - startTime) + timeOffset;

	final IVideoPicture f = converter.toPicture(image, timestamp * 1000);
	f.setKeyFrame(isFirstShotFrame);
	f.setQuality(0);
	isFirstShotFrame = false;

	videoWriter.encodeVideo(0, f);
}
 
Developer: phrack, Project: ShootOFF, Lines: 14, Source: ShotRecorder.java

Example 12: loadFirstFrame

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
/**
 * Loads the first frame of the given video and then seeks back to the beginning of the stream.
 * @param container the video container
 * @param videoCoder the video stream coder
 * @return BufferedImage
 * @throws MediaException thrown if an error occurs during decoding
 */
private BufferedImage loadFirstFrame(IContainer container, IStreamCoder videoCoder) throws MediaException {
	// walk through each packet of the container format
	IPacket packet = IPacket.make();
	while (container.readNextPacket(packet) >= 0) {
		// make sure the packet belongs to the stream we care about
		if (packet.getStreamIndex() == videoCoder.getStream().getIndex()) {
			// create a new picture for the video data to be stored in
			IVideoPicture picture = IVideoPicture.make(videoCoder.getPixelType(), videoCoder.getWidth(), videoCoder.getHeight());
			int offset = 0;
			// decode the video
			while (offset < packet.getSize()) {
				int bytesDecoded = videoCoder.decodeVideo(picture, packet, offset);
				if (bytesDecoded < 0) {
					LOGGER.error("No bytes found in container.");
					throw new MediaException();
				}
				offset += bytesDecoded;

				// make sure that we have a full picture from the video first
				if (picture.isComplete()) {
					// convert the picture to a Java buffered image
					BufferedImage target = new BufferedImage(picture.getWidth(), picture.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
					IConverter converter = ConverterFactory.createConverter(target, picture.getPixelType());
					return converter.toImage(picture);
				}
			}
		}
	}
	
	return null;
}
 
Developer: wnbittle, Project: praisenter, Lines: 39, Source: XugglerVideoMediaLoader.java
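
loadFirstFrame in Example 12 expects an already-opened container and video stream coder. A minimal sketch of obtaining them might look like this; the class name DecoderSetupSketch and the input path input.mp4 are assumptions for illustration.

import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IStreamCoder;

public class DecoderSetupSketch {
	public static void main(String[] args) {
		IContainer container = IContainer.make();
		if (container.open("input.mp4", IContainer.Type.READ, null) < 0) // hypothetical input path
			throw new RuntimeException("could not open input file");

		// find the first video stream and open its decoder
		IStreamCoder videoCoder = null;
		for (int i = 0; i < container.getNumStreams(); i++) {
			IStreamCoder coder = container.getStream(i).getStreamCoder();
			if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
				videoCoder = coder;
				break;
			}
		}
		if (videoCoder == null || videoCoder.open(null, null) < 0)
			throw new RuntimeException("no decodable video stream found");

		// BufferedImage firstFrame = loadFirstFrame(container, videoCoder);
	}
}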

Example 13: initialize

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
/**
 * Initializes the reader thread with the given media.
 * @param container the media container
 * @param videoCoder the media video decoder
 * @param audioCoder the media audio decoder
 * @param audioConversions the flag(s) for any audio conversions that must take place
 */
public void initialize(IContainer container, IStreamCoder videoCoder, IStreamCoder audioCoder, int audioConversions) {
	// assign the local variables
	this.outputWidth = 0;
	this.outputHeight = 0;
	this.videoConversionEnabled = false;
	this.scale = false;
	this.container = container;
	this.videoCoder = videoCoder;
	this.audioCoder = audioCoder;
	this.audioConversions = audioConversions;
	
	// create a packet for reading
	this.packet = IPacket.make();
	
	// create the image converter for the video
	if (videoCoder != null) {
		this.width = this.videoCoder.getWidth();
		this.height = this.videoCoder.getHeight();
		IPixelFormat.Type type = this.videoCoder.getPixelType();
		this.picture = IVideoPicture.make(type, this.width, this.height);
		BufferedImage target = new BufferedImage(this.width, this.height, BufferedImage.TYPE_3BYTE_BGR);
		this.videoConverter = ConverterFactory.createConverter(target, type);
	}
	
	// create a reusable container for the samples
	if (audioCoder != null) {
		this.samples = IAudioSamples.make(1024, this.audioCoder.getChannels());
	}
}
 
Developer: wnbittle, Project: praisenter, Lines: 37, Source: XugglerMediaReaderThread.java

Example 14: toImage

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
@Override
public BufferedImage toImage(IVideoPicture picture) {
	// test that the picture is valid
	this.validatePicture(picture);

	// resample as needed
	IVideoPicture resamplePicture = null;
	final AtomicReference<JNIReference> ref = new AtomicReference<JNIReference>(null);
	try {
		if (this.willResample()) {
			resamplePicture = AConverter.resample(picture, this.mToImageResampler);
			picture = resamplePicture;
		}

		// get picture parameters
		final int w = picture.getWidth();
		final int h = picture.getHeight();

		final float[][] r = this.bimg.img.bands.get(0).pixels;
		final float[][] g = this.bimg.img.bands.get(1).pixels;
		final float[][] b = this.bimg.img.bands.get(2).pixels;

		picture.getDataCached().get(0, this.buffer, 0, this.buffer.length);
		for (int y = 0, i = 0; y < h; y++) {
			for (int x = 0; x < w; x++, i += 3) {
				b[y][x] = ImageUtilities.BYTE_TO_FLOAT_LUT[(this.buffer[i] & 0xFF)];
				g[y][x] = ImageUtilities.BYTE_TO_FLOAT_LUT[(this.buffer[i + 1] & 0xFF)];
				r[y][x] = ImageUtilities.BYTE_TO_FLOAT_LUT[(this.buffer[i + 2] & 0xFF)];
			}
		}

		return this.bimg;
	} finally {
		if (resamplePicture != null)
			resamplePicture.delete();
		if (ref.get() != null)
			ref.get().delete();
	}
}
 
Developer: openimaj, Project: openimaj, Lines: 40, Source: XuggleVideo.java

Example 15: onVideoPicture

import com.xuggle.xuggler.IVideoPicture; // import the required package/class
public void onVideoPicture(IVideoPictureEvent event) {
    IVideoPicture picture = event.getMediaData();
    long originalTimeStamp = picture.getTimeStamp();

    // set the new time stamp to the original plus the offset established
    // for this media file

    long newTimeStamp = originalTimeStamp + mOffset;

    // keep track of predicted time of the next video picture, if the end
    // of the media file is encountered, then the offset will be adjusted
    // to this time.
    //
    // You'll note in the audio samples listener above we used
    // a method called getNextPts().  Video pictures don't have
    // a similar method because frame-rates can be variable, so
    // we don't know.  The minimum thing we do know though (since
    // all media containers require media to have monotonically
    // increasing time stamps), is that the next video timestamp
    // should be at least one tick ahead.  So, we fake it.

    mNextVideo = originalTimeStamp + 1;

    // set the new timestamp on video samples

    picture.setTimeStamp(newTimeStamp);

    // create a new video picture event with the one true video stream
    // index

    super.onVideoPicture(new VideoPictureEvent(this, picture,
            mVideoStreamIndex));
}
 
Developer: destiny1020, Project: java-learning-notes-cn, Lines: 34, Source: ConcatenateAudioAndVideo.java


Note: The com.xuggle.xuggler.IVideoPicture class examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The code snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors. Please refer to each project's license before distributing or using the code, and do not republish this article without permission.