

Java AudioTrack.MODE_STREAM Field Code Examples

This article collects typical usage examples of the Java field android.media.AudioTrack.MODE_STREAM. If you are wondering what AudioTrack.MODE_STREAM is for, or how it is used in practice, the curated code examples below may help. You can also explore further usage examples of the enclosing class, android.media.AudioTrack.


The sections below present 15 code examples that use the AudioTrack.MODE_STREAM field, ordered by popularity.
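All of the examples below follow the same three-step shape: query AudioTrack.getMinBufferSize(...), construct the AudioTrack with that size and AudioTrack.MODE_STREAM, then call play() and feed PCM data with write(). The sizing arithmetic those calls rely on is ordinary PCM math and can be sketched without any Android dependency (the class and method names below are illustrative, not Android APIs):

```java
public class PcmBufferMath {
    /** Bytes per frame: one sample per channel, e.g. 2 bytes per 16-bit sample. */
    static int bytesPerFrame(int channels, int bytesPerSample) {
        return channels * bytesPerSample;
    }

    /** How many milliseconds of audio a byte buffer holds at the given rate. */
    static long bufferMillis(int bufferBytes, int sampleRate, int channels, int bytesPerSample) {
        return 1000L * bufferBytes / ((long) sampleRate * bytesPerFrame(channels, bytesPerSample));
    }

    public static void main(String[] args) {
        // 16 kHz mono 16-bit PCM, as in Example 1: a 6400-byte buffer holds 200 ms of audio.
        System.out.println(bufferMillis(6400, 16000, 1, 2));
    }
}
```

This is useful for judging latency: the value returned by getMinBufferSize() is in bytes, so converting it to milliseconds shows how much audio the track buffers before it starts draining.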

Example 1: onCreate

@Override
public void onCreate() {
    super.onCreate();
    mHandler = new Handler();
    fetchAccessToken();

    int outputBufferSize = AudioTrack.getMinBufferSize(16000,
            AudioFormat.CHANNEL_OUT_MONO, // output mask; the original CHANNEL_IN_STEREO is a capture mask and did not match the track below
            AudioFormat.ENCODING_PCM_16BIT);

    try {
        // USE_DEFAULT_STREAM_TYPE is only meaningful for volume APIs; the
        // AudioTrack constructor needs a concrete stream type such as STREAM_MUSIC.
        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 16000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, outputBufferSize, AudioTrack.MODE_STREAM);
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            mAudioTrack.setVolume(DEFAULT_VOLUME);
        }
        mAudioTrack.play();
    }catch (Exception e){
        e.printStackTrace();
    }
}
 
Author: hsavaliya, Project: GoogleAssistantSDK, Lines: 20, Source: SpeechService.java

Example 2: AudioSink

/**
 * Constructor. Will create a new AudioSink.
 *
 * @param packetSize	size of the incoming packets
 * @param sampleRate	sample rate of the audio signal
 */
public AudioSink (int packetSize, int sampleRate) {
	this.packetSize = packetSize;
	this.sampleRate = sampleRate;

	// Create the queues and fill the output queue with empty sample packets:
	this.inputQueue = new ArrayBlockingQueue<SamplePacket>(QUEUE_SIZE);
	this.outputQueue = new ArrayBlockingQueue<SamplePacket>(QUEUE_SIZE);
	for (int i = 0; i < QUEUE_SIZE; i++)
		this.outputQueue.offer(new SamplePacket(packetSize));

	// Create an instance of the AudioTrack class:
	int bufferSize = AudioTrack.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
	this.audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, AudioFormat.CHANNEL_OUT_MONO,
								AudioFormat.ENCODING_PCM_16BIT, bufferSize, AudioTrack.MODE_STREAM);

	// Create the audio filters:
	this.audioFilter1 = FirFilter.createLowPass(2, 1, 1, 0.1f, 0.15f, 30);
	Log.d(LOGTAG,"constructor: created audio filter 1 with " + audioFilter1.getNumberOfTaps() + " Taps.");
	this.audioFilter2 = FirFilter.createLowPass(4, 1, 1, 0.1f, 0.1f, 30);
	Log.d(LOGTAG,"constructor: created audio filter 2 with " + audioFilter2.getNumberOfTaps() + " Taps.");
	this.tmpAudioSamples = new SamplePacket(packetSize);
}
 
Author: takyonxxx, Project: AndroidSdrRtlTuner, Lines: 28, Source: AudioSink.java
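The constructor above pre-fills outputQueue with empty SamplePacket objects so producer and consumer can recycle buffers without per-packet allocation: take a free packet, fill it, hand it over, process it, then return it to the pool. A minimal Android-free sketch of that recycling pattern (all names here are illustrative, not part of the project):

```java
import java.util.concurrent.ArrayBlockingQueue;

/** Buffer-recycling pattern modeled on the two queues above (names are illustrative). */
public class PacketRecycling {
    static final int QUEUE_SIZE = 2;
    final ArrayBlockingQueue<short[]> freePackets = new ArrayBlockingQueue<>(QUEUE_SIZE);
    final ArrayBlockingQueue<short[]> filledPackets = new ArrayBlockingQueue<>(QUEUE_SIZE);

    PacketRecycling(int packetSize) {
        // Pre-fill the pool, exactly like the constructor above pre-fills outputQueue:
        for (int i = 0; i < QUEUE_SIZE; i++)
            freePackets.offer(new short[packetSize]);
    }

    /** One producer/consumer round trip; returns the consumed packet. */
    short[] cycleOnce(short sample) throws InterruptedException {
        short[] packet = freePackets.take(); // producer: grab a free buffer
        packet[0] = sample;                  // ... fill it with samples ...
        filledPackets.offer(packet);         // hand it to the consumer
        short[] out = filledPackets.take();  // consumer: process it ...
        freePackets.offer(out);              // ... then recycle the buffer
        return out;
    }

    public static void main(String[] args) throws InterruptedException {
        PacketRecycling sink = new PacketRecycling(4);
        short[] out = sink.cycleOnce((short) 42);
        System.out.println(out[0]); // the consumer sees the produced sample
    }
}
```

Because both queues are bounded to the same size, a slow consumer naturally backpressures the producer: take() blocks once the free pool is empty.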

Example 3: init_

private void init_(boolean eccEnabled) {
    mEccEncoder = EccInstanceProvider.getEncoder(eccEnabled);
    int minBufferSizeInBytes = AudioTrack.getMinBufferSize(
            RATE,
            AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    // 44.1kHz mono 16bit
    mAudioTrack = new AudioTrack(
            AudioManager.STREAM_MUSIC,
            RATE,
            AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT,
            minBufferSizeInBytes,
            AudioTrack.MODE_STREAM);
    mExecutorService = Executors.newSingleThreadExecutor();
}
 
Author: egglang, Project: sonicky, Lines: 16, Source: Encoder.java

Example 4: PcmPlayer

public PcmPlayer(Context context, Handler handler) {
    this.mContext = context;
    this.audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, wBufferSize, AudioTrack.MODE_STREAM);
    this.handler = handler;
    audioTrack.setPlaybackPositionUpdateListener(this, handler);
    cacheDir = context.getExternalFilesDir(Environment.DIRECTORY_MUSIC);
}
 
Author: LingjuAI, Project: AssistantBySDK, Lines: 7, Source: PcmPlayer.java

Example 5: createAudioTrack

public AudioTrack createAudioTrack(int frameRate) {
    int minBufferSizeBytes = AudioTrack.getMinBufferSize(frameRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_FLOAT);
    Log.i(TAG, "AudioTrack.minBufferSize = " + minBufferSizeBytes
            + " bytes = " + (minBufferSizeBytes / BYTES_PER_FRAME)
            + " frames");
    // The original "8 * minBufferSizeBytes / 8" was a no-op; use the minimum size directly.
    int bufferSize = minBufferSizeBytes;
    int outputBufferSizeFrames = bufferSize / BYTES_PER_FRAME;
    Log.i(TAG, "actual bufferSize = " + bufferSize + " bytes = "
            + outputBufferSizeFrames + " frames");

    // Use the frameRate parameter (the original read an unrelated mFrameRate field),
    // so the buffer size and the track's sample rate stay consistent.
    AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
            frameRate, AudioFormat.CHANNEL_OUT_STEREO,
            AudioFormat.ENCODING_PCM_FLOAT, bufferSize,
            AudioTrack.MODE_STREAM);
    Log.i(TAG, "created AudioTrack");
    return player;
}
 
Author: sdrausty, Project: buildAPKsSamples, Lines: 18, Source: SimpleAudioOutput.java

Example 6: audioTrackInit

@SuppressLint("NewApi")
private int audioTrackInit(int sampleRateInHz, int channels) {
    //  this.sampleRateInHz = sampleRateInHz;
    //  this.channels = channels;
    //  return 0;

    audioTrackRelease();
    int channelConfig = channels >= 2 ? AudioFormat.CHANNEL_OUT_STEREO : AudioFormat.CHANNEL_OUT_MONO;
    try {
        mAudioTrackBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT);
        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT, mAudioTrackBufferSize, AudioTrack.MODE_STREAM);
    } catch (Exception e) {
        mAudioTrackBufferSize = 0;
        // android.util.Log has no (tag, Throwable) overload; pass a message as well.
        Log.e("audioTrackInit", "failed to create AudioTrack", e);
    }
    return mAudioTrackBufferSize;
}
 
Author: WangZhiYao, Project: VideoDemo, Lines: 18, Source: MediaPlayer.java

Example 7: start

/**
 * Set the tone frequency.
 * @param rate frequency in Hz
 */
@SuppressWarnings("deprecation")
public void start(int rate) {
	stop();
	if (rate > 0) {
		Hz = rate;
		waveLen = RATE / Hz;
		length = waveLen * Hz;
		audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, RATE,
				AudioFormat.CHANNEL_CONFIGURATION_STEREO, // deprecated; use CHANNEL_OUT_STEREO on modern APIs
				AudioFormat.ENCODING_PCM_8BIT, length, AudioTrack.MODE_STREAM);
		// Generate the sine wave (waveLen * Hz samples, roughly one second at RATE):
		wave = SinWave.sin(wave, waveLen, length);
		if (audioTrack != null) {
			audioTrack.play();
		}
	}
}
 
Author: Becavalier, Project: QRDataTransfer-Android, Lines: 24, Source: AudioTrackManager.java
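The SinWave.sin(...) call above is project code and not shown here. For ENCODING_PCM_8BIT, AudioTrack expects unsigned 8-bit samples centered on 128, so an equivalent one-period wave table could be sketched as follows (SinWaveTable is a hypothetical stand-in, not the project's class):

```java
public class SinWaveTable {
    /** One period of a sine wave as unsigned 8-bit PCM (values 1..255, centered on 128). */
    static byte[] onePeriod(int waveLen) {
        byte[] wave = new byte[waveLen];
        for (int i = 0; i < waveLen; i++) {
            int sample = (int) Math.round(128 + 127 * Math.sin(2 * Math.PI * i / waveLen));
            wave[i] = (byte) sample; // stored as the unsigned byte bit pattern
        }
        return wave;
    }

    public static void main(String[] args) {
        byte[] wave = onePeriod(100); // e.g. waveLen = RATE / Hz = 44100 / 441
        System.out.println(wave.length);
        System.out.println(wave[0] & 0xFF); // first sample sits at the midpoint, 128
    }
}
```

To fill a whole buffer as the start() method does, the period is simply repeated Hz times; masking with & 0xFF recovers the unsigned value from Java's signed byte.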

Example 8: AudioThread

public AudioThread(int sampleRateInHz, int channel, long streamId, long decoderId, Media media)
{
	if (channel == 1)
	{
		channel_configuration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
	} else
	{
		channel_configuration = AudioFormat.CHANNEL_CONFIGURATION_STEREO;
	}
	this.mediaStreamId = streamId;
	this.decoderId = decoderId;
	this.media = media;
	int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, channel_configuration, AudioFormat.ENCODING_PCM_16BIT);
	if (minBufferSize > audioLength)
	{
		audioLength = minBufferSize;
	}
	mAudioBuffer = new byte[audioLength];
	mAudio = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHz, channel_configuration, AudioFormat.ENCODING_PCM_16BIT, audioLength, AudioTrack.MODE_STREAM);
}
 
Author: OpenIchano, Project: Viewer, Lines: 20, Source: AudioThread.java

Example 9: initializeAndroidAudio

private void initializeAndroidAudio(int sampleRate) throws Exception {
    int minBufferSize = AudioTrack.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

    if (minBufferSize < 0) {
        throw new Exception("Failed to get minimum buffer size: "
                + Integer.toString(minBufferSize));
    }

    track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            minBufferSize, AudioTrack.MODE_STREAM);

}
 
Author: ccfish86, Project: sctalk, Lines: 14, Source: SpeexDecoder.java

Example 10: prepare

@Override
protected void prepare() throws IOException {
    if (mState < STATE_PREPARED) {
        MediaFormat format;
        if (mState == STATE_UNINITIALIZED) {
            mTrackIndex = selectTrack();
            if (mTrackIndex < 0) {
                setState(STATE_NO_TRACK_FOUND);
                return;
            }
            mExtractor.selectTrack(mTrackIndex);
            format = mExtractor.getTrackFormat(mTrackIndex);
            mSampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);
            int audioChannels = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
            mAudioTrack = new AudioTrack(
                    AudioManager.STREAM_MUSIC,
                    mSampleRate,
                    (audioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO),
                    AudioFormat.ENCODING_PCM_16BIT,
                    AudioTrack.getMinBufferSize(
                            mSampleRate,
                            (audioChannels == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO),
                            AudioFormat.ENCODING_PCM_16BIT
                    ),
                    AudioTrack.MODE_STREAM
            );
            mState = STATE_INITIALIZED;
        } else {
            format = mExtractor.getTrackFormat(mTrackIndex);
        }

        String mime = format.getString(MediaFormat.KEY_MIME);
        Log.d(TAG, mime);
        mMediaCodec = MediaCodec.createDecoderByType(mime);
//        mMediaCodec.setCallback(mCallback);
        mMediaCodec.configure(format, null, null, 0);
        setState(STATE_PREPARED);
    }
    super.prepare();
}
 
Author: Tai-Kimura, Project: VideoApplication, Lines: 40, Source: AudioDecoder.java

Example 11: audioTrackInit

public int audioTrackInit() {
//    Log.e("  ffff mediaplayer audiotrackinit start .  sampleRateInHz:=" + sampleRateInHz + " channels:=" + channels );
    audioTrackRelease();
    int channelConfig = channels >= 2 ? AudioFormat.CHANNEL_OUT_STEREO : AudioFormat.CHANNEL_OUT_MONO;
    try {
        mAudioTrackBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT);
        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT, mAudioTrackBufferSize, AudioTrack.MODE_STREAM);
    } catch (Exception e) {
        mAudioTrackBufferSize = 0;
        // android.util.Log has no (tag, Throwable) overload; pass a message as well.
        Log.e("audioTrackInit", "failed to create AudioTrack", e);
    }
    return mAudioTrackBufferSize;
}
 
Author: Leavessilent, Project: QuanMinTV, Lines: 13, Source: MediaPlayer.java

Example 12: createAudioTrack

private void createAudioTrack() throws InitializationException {
    // The AudioTrack configurations parameters used here, are guaranteed to
    // be supported on all devices.

    // AudioFormat.CHANNEL_OUT_MONO should be used in place of deprecated
    // AudioFormat.CHANNEL_CONFIGURATION_MONO, but it is not available for
    // API level 3.

    // Output buffer for playing should be as short as possible, so
    // AudioBufferPlayed events are not invoked long before audio buffer is
    // actually played. Also, when AudioTrack is stopped, it is filled with
    // silence of length audioTrackBufferSizeInBytes. If the silence is too
    // long, it causes a delay before the next recorded data starts playing.
    audioTrackBufferSizeInBytes = AudioTrack.getMinBufferSize(
            SpeechTrainerConfig.SAMPLE_RATE_HZ,
            AudioFormat.CHANNEL_CONFIGURATION_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    if (audioTrackBufferSizeInBytes <= 0) {
        throw new InitializationException("Failed to initialize playback.");
    }

    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
            SpeechTrainerConfig.SAMPLE_RATE_HZ,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
            audioTrackBufferSizeInBytes,
            AudioTrack.MODE_STREAM);
    if (audioTrack.getState() != AudioTrack.STATE_INITIALIZED) {
        audioTrack = null;
        throw new InitializationException("Failed to initialize playback.");
    }
}
 
Author: sdrausty, Project: buildAPKsApps, Lines: 31, Source: ControllerFactory.java

Example 13: audioDecoderTest

public void audioDecoderTest(String filePath) throws IOException {
  AudioDecoder audioDecoderThread = new AudioDecoder(this, this);
  audioDecoderThread.initExtractor(filePath);
  audioDecoderThread.prepareAudio();

  int buffsize = AudioTrack.getMinBufferSize(audioDecoderThread.getSampleRate(),
      AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
  audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, audioDecoderThread.getSampleRate(),
      AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT, buffsize,
      AudioTrack.MODE_STREAM);
  audioTrack.play();
  audioDecoderThread.start();
}
 
Author: pedroSG94, Project: rtmp-rtsp-stream-client-java, Lines: 13, Source: DecodersTest.java

Example 14: createAudioTrackOnLollipopOrHigher

@TargetApi(21)
private static AudioTrack createAudioTrackOnLollipopOrHigher(
    int sampleRateInHz, int channelConfig, int bufferSizeInBytes) {
  Logging.d(TAG, "createAudioTrackOnLollipopOrHigher");
  // TODO(henrika): use setPerformanceMode(int) with PERFORMANCE_MODE_LOW_LATENCY to control
  // performance when Android O is supported. Add some logging in the mean time.
  final int nativeOutputSampleRate =
      AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_VOICE_CALL);
  Logging.d(TAG, "nativeOutputSampleRate: " + nativeOutputSampleRate);
  if (sampleRateInHz != nativeOutputSampleRate) {
    Logging.w(TAG, "Unable to use fast mode since requested sample rate is not native");
  }
  if (usageAttribute != DEFAULT_USAGE) {
    Logging.w(TAG, "A non default usage attribute is used: " + usageAttribute);
  }
  // Create an audio track where the audio usage is for VoIP and the content type is speech.
  return new AudioTrack(
      new AudioAttributes.Builder()
          .setUsage(usageAttribute)
          .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
      .build(),
      new AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(sampleRateInHz)
        .setChannelMask(channelConfig)
        .build(),
      bufferSizeInBytes,
      AudioTrack.MODE_STREAM,
      AudioManager.AUDIO_SESSION_ID_GENERATE);
}
 
Author: Piasy, Project: AppRTC-Android, Lines: 30, Source: WebRtcAudioTrack.java

Example 15: SoundGenerator

public SoundGenerator() {
    // Create the track in streaming mode.
    this.audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
            SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, NUM_SAMPLES,
            AudioTrack.MODE_STREAM);
    // Call play so the track will start playing when data is written.
    this.audioTrack.play();
}
 
Author: nelladragon, Project: scab, Lines: 9, Source: SoundGenerator.java


Note: The android.media.AudioTrack.MODE_STREAM examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects; copyright remains with the original authors. Refer to each project's license before distributing or using the code, and do not reproduce this article without permission.