This article collects typical usage examples of the Java class org.gearvrf.GVRBitmapTexture. If you are unsure what GVRBitmapTexture is for or how to use it, the selected class code examples below may help.
The GVRBitmapTexture class belongs to the org.gearvrf package. Three code examples are shown below, ordered by popularity.
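All three examples construct the texture from an android.graphics.Bitmap via the GVRBitmapTexture(GVRContext, Bitmap) constructor. Here is a minimal sketch of that pattern, not taken from the examples; the method name and the R.drawable.logo resource id are placeholder assumptions.

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import org.gearvrf.GVRBitmapTexture;
import org.gearvrf.GVRContext;

// Illustrative sketch (assumed names): decode an Android Bitmap and wrap it
// in a GVRBitmapTexture. R.drawable.logo is a placeholder resource id.
static GVRBitmapTexture textureFromResource(GVRContext gvrContext, Context androidContext) {
    Bitmap bitmap = BitmapFactory.decodeResource(androidContext.getResources(), R.drawable.logo);
    return new GVRBitmapTexture(gvrContext, bitmap);
}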
Example 1: generate
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.LinearGradient;
import android.graphics.Paint;
import android.graphics.Rect;
import android.graphics.Shader;
import org.gearvrf.GVRBitmapTexture; // import the required package/class
import org.gearvrf.GVRContext;

// Renders a vertical linear gradient into a bitmap and wraps it in a GVRBitmapTexture.
// WIDTH and HEIGHT are constants defined elsewhere in the enclosing class.
public GVRBitmapTexture generate(GVRContext context, int[] colors, float[] stops) {
    Bitmap bitmap = Bitmap.createBitmap(WIDTH, HEIGHT, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    Shader shader = new LinearGradient(0, 0, 0, HEIGHT,
            colors, stops,
            Shader.TileMode.CLAMP);
    Paint paint = new Paint();
    paint.setShader(shader);
    canvas.drawRect(new Rect(0, 0, WIDTH, HEIGHT), paint);
    GVRBitmapTexture texture = new GVRBitmapTexture(context, bitmap);
    return texture;
}
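A hypothetical call site for generate might look like the following; the white-to-black two-stop gradient, the quad size, and the GradientGenerator name for the enclosing class are assumptions for illustration, not from the article.

import org.gearvrf.GVRBitmapTexture;
import org.gearvrf.GVRContext;
import org.gearvrf.GVRSceneObject;

// Hypothetical usage (assumed names): build a vertical white-to-black gradient
// with generate() and show it on a 1 x 1 quad in the main scene.
void showGradientQuad(GVRContext gvrContext, GradientGenerator gradients) {
    int[] colors = {0xFFFFFFFF, 0xFF000000};  // white at the top, black at the bottom
    float[] stops = {0f, 1f};                 // gradient positions, from 0.0 to 1.0
    GVRBitmapTexture gradientTexture = gradients.generate(gvrContext, colors, stops);
    GVRSceneObject quad = new GVRSceneObject(gvrContext, 1f, 1f, gradientTexture);
    gvrContext.getMainScene().addSceneObject(quad);
}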
Example 2: setColor
import android.graphics.Bitmap;
import android.graphics.Canvas;
import org.gearvrf.GVRBitmapTexture; // import the required package/class
import org.gearvrf.GVRMaterial;

// Fills a small bitmap with a solid color and applies it as the main texture of
// both scene objects. mContext and objects are fields of the enclosing class.
public void setColor(int color) {
    Bitmap bitmap = Bitmap.createBitmap(32, 32, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    canvas.drawColor(color);
    GVRBitmapTexture texture = new GVRBitmapTexture(mContext, bitmap);
    GVRMaterial material = new GVRMaterial(mContext);
    material.setMainTexture(texture);
    for (int i = 0; i < 2; i++) {
        objects[i].getRenderData().setMaterial(material);
    }
}
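A caller would pass a standard Android ARGB color int, for example via android.graphics.Color; the variable name 'highlight' below is a placeholder for whatever object defines setColor().

import android.graphics.Color;

// Hypothetical call site (assumed name): tint the objects with a solid color.
highlight.setColor(Color.argb(255, 0, 128, 255));  // opaque light blue
highlight.setColor(Color.RED);                      // predefined Android color constant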
Example 3: ExploreVideoScene
import android.graphics.Bitmap;
import android.media.ThumbnailUtils;
import android.provider.MediaStore;
import java.io.File;
import java.util.List;
import org.gearvrf.GVRAndroidResource;
import org.gearvrf.GVRBitmapTexture; // import the required package/class
import org.gearvrf.GVRContext;
import org.gearvrf.GVRPicker;
import org.gearvrf.GVRSceneObject;

// Builds a scene that shows one pickable thumbnail quad per video file.
// THUMB_WIDTH, THUMB_HEIGHT, mPickHandler, mPicker, mSceneObjects and mVideos
// are members of the enclosing class.
public ExploreVideoScene(GVRContext gvrContext, List<File> videos) {
    super(gvrContext);
    getMainCameraRig().getLeftCamera().setBackgroundColor(1f, 1f, 1f, 1f);
    getMainCameraRig().getRightCamera().setBackgroundColor(1f, 1f, 1f, 1f);

    // A small quad attached to the camera rig that acts as a gaze cursor.
    GVRSceneObject eyeTracker = new GVRSceneObject(gvrContext,
            gvrContext.createQuad(0.1f, 0.1f),
            gvrContext.loadTexture(new GVRAndroidResource(gvrContext, R.drawable.tracker)));
    eyeTracker.getTransform().setPosition(0f, 0f, -1f);
    eyeTracker.getRenderData().setDepthTest(false);
    eyeTracker.getRenderData().setRenderingOrder(100000);
    getMainCameraRig().addChildObject(eyeTracker);

    mPickHandler = new VideoPickHandler();
    getEventReceiver().addListener(mPickHandler);
    mPicker = new GVRPicker(gvrContext, this);

    for (int i = 0; i < videos.size(); i++) {
        File video = videos.get(i);
        // Generate a small video thumbnail and wrap it in a GVRBitmapTexture.
        Bitmap thumb = ThumbnailUtils.createVideoThumbnail(
                video.getPath(), MediaStore.Video.Thumbnails.MICRO_KIND);
        GVRBitmapTexture texture = new GVRBitmapTexture(gvrContext, thumb);
        GVRSceneObject sceneObject = new GVRSceneObject(
                gvrContext, THUMB_WIDTH, THUMB_HEIGHT, texture);

        // Lay the thumbnails out in a horizontal row in front of the camera;
        // 'start' is the x position of the leftmost thumbnail.
        float start = 1f - THUMB_WIDTH * (11 * videos.size() - 1) / 20;
        sceneObject.setPickingEnabled(true);
        sceneObject.getRenderData().getMaterial().setOpacity(0.5f);
        sceneObject.getTransform().setPosition(start + THUMB_WIDTH * i * 1.1f, 0.0f, -3.0f);

        mSceneObjects.add(sceneObject);
        mVideos.add(video);
        // add the scene object to the scene graph
        addSceneObject(sceneObject);
    }
}
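For context, a scene like this is typically created and made current from the application's GVRMain entry point. The sketch below assumes a hypothetical ExploreVideoMain class and a listVideoFiles() helper that collects the video files; neither appears in the article.

import java.io.File;
import java.util.Arrays;
import java.util.List;
import org.gearvrf.GVRContext;
import org.gearvrf.GVRMain;

// Hypothetical entry point (assumed names) that installs ExploreVideoScene as the main scene.
public class ExploreVideoMain extends GVRMain {
    @Override
    public void onInit(GVRContext gvrContext) {
        List<File> videos = listVideoFiles();                 // assumed helper, see below
        ExploreVideoScene scene = new ExploreVideoScene(gvrContext, videos);
        gvrContext.setMainScene(scene);                       // make the new scene current
    }

    private List<File> listVideoFiles() {
        // Placeholder: return whatever video files the app wants to display.
        return Arrays.asList(new File("/sdcard/Movies/sample.mp4"));
    }
}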