├── CONTRIBUTING.md ├── LICENSE.md ├── NativeCode ├── AVHandler.cpp ├── AVHandler.h ├── DX11TextureObject.cpp ├── DX11TextureObject.h ├── DecoderFFmpeg.cpp ├── DecoderFFmpeg.h ├── IDecoder.h ├── ITextureObject.h ├── Logger.cpp ├── Logger.h ├── ViveMediaDecoder.cpp ├── ViveMediaDecoder.def ├── ViveMediaDecoder.h └── include │ └── Unity │ ├── IUnityGraphics.h │ ├── IUnityGraphicsD3D11.h │ └── IUnityInterface.h ├── README.md └── UnityPackage ├── Copyright notice.txt ├── Copyright notice.txt.meta ├── Materials.meta ├── Materials ├── YUV2RGBA.mat └── YUV2RGBA.mat.meta ├── Plugins.meta ├── Plugins ├── x86.meta ├── x86 │ ├── ViveMediaDecoder.dll │ └── ViveMediaDecoder.dll.meta ├── x86_64.meta └── x86_64 │ ├── ViveMediaDecoder.dll │ └── ViveMediaDecoder.dll.meta ├── Scenes.meta ├── Scenes ├── Demo.unity ├── Demo.unity.meta ├── SampleScene.unity └── SampleScene.unity.meta ├── Scripts.meta ├── Scripts ├── PlayerScripts.meta ├── PlayerScripts │ ├── FileSeeker.cs │ ├── FileSeeker.cs.meta │ ├── ImageSourceController.cs │ ├── ImageSourceController.cs.meta │ ├── StereoHandler.cs │ ├── StereoHandler.cs.meta │ ├── StereoImageSourceController.cs │ ├── StereoImageSourceController.cs.meta │ ├── StereoProperty.cs │ ├── StereoProperty.cs.meta │ ├── StereoVideoSourceController.cs │ ├── StereoVideoSourceController.cs.meta │ ├── VideoSourceController.cs │ └── VideoSourceController.cs.meta ├── UVSphere.cs ├── UVSphere.cs.meta ├── ViveMediaDecoder.cs └── ViveMediaDecoder.cs.meta ├── Shaders.meta ├── Shaders ├── YUV2RGBA.shader └── YUV2RGBA.shader.meta ├── config ├── config.meta ├── readme.txt └── readme.txt.meta /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributor License Agreement 2 | This contributor license agreement ("Agreement") describes a contract 3 | between You and HTC Corporation and its affiliates ("HTC") for Your 4 | present and future Contribution (as defined below) in this project (e.g. 
5 | submit an issue or a pull request). Please read it carefully before 6 | submitting Your Contribution to the project. By submitting Your 7 | Contribution, You accept this license. If You do not agree to the terms 8 | of this Agreement, please do not submit Your Contribution to this project. 9 | 10 | This version of the Agreement allows a corporate entity ("Corporation") 11 | to submit Contribution to HTC, to authorize Contribution submitted by 12 | its designated employees to HTC, and to grant a license thereto. 13 | 14 | ## 1. Definition 15 | "Contribution" means the code, documentation or any original work of 16 | authorship that is submitted by you to HTC for inclusion in, or 17 | documentation of, the software or any products developed or managed by 18 | HTC ("Work"). 19 | "You" (or "Your") means with respect to the Contribution the owner or 20 | legal entity authorized by the owner that is making this Agreement with 21 | HTC. For legal entities, the entity making a Contribution and all other 22 | entities that control, are controlled by, or are under common control 23 | with that entity are considered to be a single Contributor. For the 24 | purposes of this definition, "control" means (i) the power, direct or 25 | indirect, to cause the direction or management of such entity, whether 26 | by contract or otherwise, or (ii) ownership of fifty percent (50%) or 27 | more of the outstanding shares, or (iii) beneficial ownership of such 28 | entity. 29 | 30 | ## 2. License 31 | Except for the license granted to HTC and recipients of Work, you 32 | reserve all right, title and interest in and to your Contribution. 33 | You grant HTC and its recipients a perpetual, non-exclusive, 34 | irrevocable, royalty-free, worldwide copyright license to make, use, 35 | sell, reproduce, modify, publicly display and perform, distribute 36 | (directly and indirectly), and sublicense your Contribution, and any 37 | derivative works to your Contribution as part of Work. 
38 | You acknowledge and agree that based on the grant of rights under this 39 | Agreement, if HTC includes your Contribution in Work, HTC may license 40 | the Contribution under the terms of the license with respect to the Work 41 | between HTC and third parties. 42 | 43 | ## 3. Covenant not to Assert 44 | You irrevocably covenant that you will not assert any of your patent 45 | rights against HTC for Permitted Use, where such covenant not to assert 46 | applies only to those patent claims that are necessarily infringed by 47 | your Contribution alone or by combination of your Contribution with the 48 | Work to which such Contribution(s) was submitted. For the purpose of 49 | this Section, "Permitted Use" means HTC’s making, having made, 50 | reproduction, modification, public display and performance, distribution 51 | and/or sublicense of your Contribution and/or any derivative works to 52 | your Contribution as part of Work. 53 | 54 | ## 4. HTC Right 55 | You acknowledge that HTC is not obligated to use your Contribution as 56 | part of the Work and may decide to include any Contribution at HTC’s sole 57 | discretion. 58 | 59 | ## 5. Originality of Work 60 | You warrant and represent that each of Your Contributions is entirely 61 | Your original work. Should you wish to submit a suggestion or work that 62 | is not your original creation, you may submit it to HTC separately from 63 | any Contribution, explicitly identifying the complete details of its 64 | source and of any license or other restrictions of which you are 65 | personally aware. 66 | 67 | ## 6.
Representations and Warranties 68 | You warrant and represent that (a) You have the legal authority to enter 69 | into this Agreement; (b) if You are a corporate entity, each employee of 70 | the Corporation designated by You is authorized to submit Contribution 71 | on behalf of the Corporation; (c) each Contribution is Your original 72 | creation which does not contain any third party’s software (including 73 | but not limited to open source software) and you have the sole and 74 | complete authority to grant this license to HTC; and (d) Your license to 75 | HTC under this Agreement does not require any permission from any third 76 | party and will not violate any grant of rights which you have made to 77 | third parties. 78 | 79 | UNLESS REQUIRED BY APPLICABLE LAW OR EXPRESSLY PROVIDED OTHERWISE IN 80 | THIS AGREEMENT, YOU PROVIDE YOUR CONTRIBUTION ON AN "AS IS" BASIS, 81 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED, 82 | INCLUDING BUT NOT LIMITED TO ANY WARRANTIES OR CONDITIONS OF 83 | MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. 84 | 85 | ## 7. Notice 86 | You agree to notify HTC of any facts or circumstances of which You 87 | become aware that would make the representations above inaccurate in any 88 | respect. 89 | 90 | To maintain the repository, HTC accepts issues and 91 | pull requests in the following cases: 92 | - Changes that fix bugs. 93 | - New features that fit our plan. 94 | - For other change requests or suggestions to the project, please submit 95 | to ViveSoftware@htc.com. -------------------------------------------------------------------------------- /LICENSE.md: -------------------------------------------------------------------------------- 1 | # ViveMediaDecoder License 2 | Copyright (c) 2015-2019 HTC Corporation. All Rights Reserved.
3 | 4 | This document describes a contract between you and HTC Corporation and 5 | its affiliates (collectively “HTC”) for the Works which refer to this 6 | software and corresponding documentation provided by HTC under the terms 7 | of this license. Please read it carefully before downloading or using 8 | this Work. By downloading and/or using this Work, you accept this 9 | license. If you do not agree to the terms of this license, please do not 10 | download or use this Work. 11 | 12 | Unless otherwise provided herein, the information contained in the Work 13 | is the exclusive property of HTC. 14 | 15 | HTC grants you a non-exclusive, non-assignable and royalty-free right 16 | and license to use, modify (if provided in a source code form) or 17 | distribute the Work and any modification you make to the Work, within 18 | the scope of the legitimate development of your software product. The 19 | usage and redistribution of the Work, with or without modification, is 20 | permitted provided that the following conditions are met: 21 | 22 | 1. Redistributions of the Work in a source code form must retain the 23 | above copyright notice, this list of conditions and the following 24 | disclaimer. 25 | 26 | 2. Redistribution of the Work in binary form must reproduce the above 27 | copyright notice, this list of conditions and the following disclaimer in 28 | the documentation and/or other materials provided with the distribution. 29 | 30 | 3. Neither HTC nor the names of its licensors or contributors may be 31 | used to endorse or promote products derived from this Work without 32 | specific prior written permission. 33 | 34 | THE WORK IS PROVIDED "AS IS". HTC, ITS LICENSORS AND CONTRIBUTORS 35 | DISCLAIM ALL WARRANTIES WITH RESPECT TO THIS WORK, EITHER EXPRESS OR 36 | IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE WARRANTIES OF 37 | MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
38 | HTC MAY MAKE CHANGES TO THE WORK, AT ANY TIME WITHOUT NOTICE, BUT IS NOT 39 | OBLIGATED TO SUPPORT, UPDATE OR UPGRADE FOR THE WORK. 40 | 41 | IN NO EVENT SHALL HTC, ITS LICENSORS OR CONTRIBUTORS BE LIABLE FOR ANY 42 | CLAIM, DAMAGES OR OTHER LIABILITY WHATSOEVER (INCLUDING, WITHOUT 43 | LIMITATION, DAMAGES FOR LOSS OF BUSINESS PROFITS, BUSINESS 44 | INTERRUPTION, LOSS OF BUSINESS INFORMATION, OR ANY OTHER PECUNIARY 45 | LOSS), WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 46 | FROM, OUT OF OR IN CONNECTION WITH THE WORK, THE USE OR INABILITY TO USE 47 | THE WORK, EVEN IF HTC, ITS LICENSORS OR CONTRIBUTORS HAVE BEEN ADVISED 48 | OF THE POSSIBILITY OF SUCH DAMAGES. -------------------------------------------------------------------------------- /NativeCode/AVHandler.cpp: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #include "AVHandler.h" 4 | #include "DecoderFFmpeg.h" 5 | #include "Logger.h" 6 | 7 | AVHandler::AVHandler() { 8 | mDecoderState = UNINITIALIZED; 9 | mSeekTime = 0.0; 10 | mIDecoder = std::make_unique<DecoderFFmpeg>(); 11 | } 12 | 13 | void AVHandler::init(const char* filePath) { 14 | if (mIDecoder == NULL || !mIDecoder->init(filePath)) { 15 | mDecoderState = INIT_FAIL; 16 | } else { 17 | mDecoderState = INITIALIZED; 18 | } 19 | } 20 | 21 | AVHandler::DecoderState AVHandler::getDecoderState() { 22 | return mDecoderState; 23 | } 24 | 25 | void AVHandler::stopDecoding() { 26 | mDecoderState = STOP; 27 | if (mDecodeThread.joinable()) { 28 | mDecodeThread.join(); 29 | } 30 | 31 | mIDecoder = NULL; 32 | mDecoderState = UNINITIALIZED; 33 | } 34 | 35 | double AVHandler::getVideoFrame(uint8_t** outputY, uint8_t** outputU, uint8_t** outputV) { 36 | if (mIDecoder == NULL || !mIDecoder->getVideoInfo().isEnabled || mDecoderState == SEEK) { 37 | LOG("Video is not available.
\n"); 38 | *outputY = *outputU = *outputV = NULL; 39 | return -1; 40 | } 41 | 42 | return mIDecoder->getVideoFrame(outputY, outputU, outputV); 43 | } 44 | 45 | double AVHandler::getAudioFrame(uint8_t** outputFrame, int& frameSize) { 46 | if (mIDecoder == NULL || !mIDecoder->getAudioInfo().isEnabled || mDecoderState == SEEK) { 47 | LOG("Audio is not available. \n"); 48 | *outputFrame = NULL; 49 | return -1; 50 | } 51 | 52 | return mIDecoder->getAudioFrame(outputFrame, frameSize); 53 | } 54 | 55 | void AVHandler::freeVideoFrame() { 56 | if (mIDecoder == NULL || !mIDecoder->getVideoInfo().isEnabled || mDecoderState == SEEK) { 57 | LOG("Video is not available. \n"); 58 | return; 59 | } 60 | 61 | mIDecoder->freeVideoFrame(); 62 | } 63 | 64 | void AVHandler::freeAudioFrame() { 65 | if (mIDecoder == NULL || !mIDecoder->getAudioInfo().isEnabled || mDecoderState == SEEK) { 66 | LOG("Audio is not available. \n"); 67 | return; 68 | } 69 | 70 | mIDecoder->freeAudioFrame(); 71 | } 72 | 73 | void AVHandler::startDecoding() { 74 | if (mIDecoder == NULL || mDecoderState != INITIALIZED) { 75 | LOG("Not initialized, decode thread would not start. \n"); 76 | return; 77 | } 78 | 79 | mDecodeThread = std::thread([&]() { 80 | if (!(mIDecoder->getVideoInfo().isEnabled || mIDecoder->getAudioInfo().isEnabled)) { 81 | LOG("No stream enabled. \n"); 82 | LOG("Decode thread would not start. 
\n"); 83 | return; 84 | } 85 | 86 | mDecoderState = DECODING; 87 | while (mDecoderState != STOP) { 88 | switch (mDecoderState) { 89 | case DECODING: 90 | if (!mIDecoder->decode()) { 91 | mDecoderState = DECODE_EOF; 92 | } 93 | break; 94 | case SEEK: 95 | mIDecoder->seek(mSeekTime); 96 | mDecoderState = DECODING; 97 | break; 98 | case DECODE_EOF: 99 | break; 100 | } 101 | } 102 | }); 103 | } 104 | 105 | AVHandler::~AVHandler() { 106 | stopDecoding(); 107 | } 108 | 109 | void AVHandler::setSeekTime(float sec) { 110 | if (mDecoderState < INITIALIZED || mDecoderState == SEEK) { 111 | LOG("Seek unavailable."); 112 | return; 113 | } 114 | 115 | mSeekTime = sec; 116 | mDecoderState = SEEK; 117 | } 118 | 119 | IDecoder::VideoInfo AVHandler::getVideoInfo() { 120 | return mIDecoder->getVideoInfo(); 121 | } 122 | 123 | IDecoder::AudioInfo AVHandler::getAudioInfo() { 124 | return mIDecoder->getAudioInfo(); 125 | } 126 | 127 | bool AVHandler::isVideoBufferEmpty() { 128 | IDecoder::VideoInfo videoInfo = mIDecoder->getVideoInfo(); 129 | IDecoder::BufferState EMPTY = IDecoder::BufferState::EMPTY; 130 | return videoInfo.isEnabled && videoInfo.bufferState == EMPTY; 131 | } 132 | 133 | bool AVHandler::isVideoBufferFull() { 134 | IDecoder::VideoInfo videoInfo = mIDecoder->getVideoInfo(); 135 | IDecoder::BufferState FULL = IDecoder::BufferState::FULL; 136 | return videoInfo.isEnabled && videoInfo.bufferState == FULL; 137 | } 138 | 139 | int AVHandler::getMetaData(char**& key, char**& value) { 140 | if (mIDecoder == NULL || mDecoderState <= UNINITIALIZED) { 141 | return 0; 142 | } 143 | 144 | return mIDecoder->getMetaData(key, value); 145 | } 146 | 147 | void AVHandler::setVideoEnable(bool isEnable) { 148 | if (mIDecoder == NULL) { 149 | return; 150 | } 151 | 152 | mIDecoder->setVideoEnable(isEnable); 153 | } 154 | 155 | void AVHandler::setAudioEnable(bool isEnable) { 156 | if (mIDecoder == NULL) { 157 | return; 158 | } 159 | 160 | mIDecoder->setAudioEnable(isEnable); 161 | }
162 | 163 | void AVHandler::setAudioAllChDataEnable(bool isEnable) { 164 | if (mIDecoder == NULL) { 165 | return; 166 | } 167 | 168 | mIDecoder->setAudioAllChDataEnable(isEnable); 169 | } -------------------------------------------------------------------------------- /NativeCode/AVHandler.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #pragma once 4 | #include "IDecoder.h" 5 | #include <memory> 6 | #include <thread> 7 | #include <stdint.h> 8 | 9 | class AVHandler { 10 | public: 11 | AVHandler(); 12 | ~AVHandler(); 13 | 14 | enum DecoderState { 15 | INIT_FAIL = -1, UNINITIALIZED, INITIALIZED, DECODING, SEEK, BUFFERING, DECODE_EOF, STOP 16 | }; 17 | DecoderState getDecoderState(); 18 | 19 | void init(const char* filePath); 20 | void startDecoding(); 21 | void stopDecoding(); 22 | void setSeekTime(float sec); 23 | 24 | double getVideoFrame(uint8_t** outputY, uint8_t** outputU, uint8_t** outputV); 25 | double getAudioFrame(uint8_t** outputFrame, int& frameSize); 26 | void freeVideoFrame(); 27 | void freeAudioFrame(); 28 | void setVideoEnable(bool isEnable); 29 | void setAudioEnable(bool isEnable); 30 | void setAudioAllChDataEnable(bool isEnable); 31 | 32 | IDecoder::VideoInfo getVideoInfo(); 33 | IDecoder::AudioInfo getAudioInfo(); 34 | bool isVideoBufferEmpty(); 35 | bool isVideoBufferFull(); 36 | 37 | int getMetaData(char**& key, char**& value); 38 | 39 | private: 40 | DecoderState mDecoderState; 41 | std::unique_ptr<IDecoder> mIDecoder; 42 | double mSeekTime; 43 | 44 | std::thread mDecodeThread; 45 | }; -------------------------------------------------------------------------------- /NativeCode/DX11TextureObject.cpp: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved.
=========== 2 | 3 | #include "DX11TextureObject.h" 4 | #include "Logger.h" 5 | #include 6 | 7 | DX11TextureObject::DX11TextureObject() { 8 | mD3D11Device = NULL; 9 | mWidthY = mHeightY = mLengthY = 0; 10 | mWidthUV = mHeightUV = mLengthUV = 0; 11 | 12 | for (int i = 0; i < TEXTURE_NUM; i++) { 13 | mTextures[i] = NULL; 14 | mShaderResourceView[i] = NULL; 15 | } 16 | } 17 | 18 | DX11TextureObject::~DX11TextureObject() { 19 | destroy(); 20 | } 21 | 22 | void DX11TextureObject::getResourcePointers(void*& ptry, void*& ptru, void*& ptrv) { 23 | if (mD3D11Device == NULL) { 24 | return; 25 | } 26 | 27 | ptry = mShaderResourceView[0]; 28 | ptru = mShaderResourceView[1]; 29 | ptrv = mShaderResourceView[2]; 30 | } 31 | 32 | void DX11TextureObject::create(void* handler, unsigned int width, unsigned int height) { 33 | if (handler == NULL) { 34 | return; 35 | } 36 | 37 | mD3D11Device = (ID3D11Device*) handler; 38 | mWidthY = (unsigned int)(ceil((float) width / CPU_ALIGMENT) * CPU_ALIGMENT); 39 | mHeightY = height; 40 | mLengthY = mWidthY * mHeightY; 41 | 42 | mWidthUV = mWidthY / 2; 43 | mHeightUV = mHeightY / 2; 44 | mLengthUV = mWidthUV * mHeightUV; 45 | 46 | // For YUV420 47 | // Y channel 48 | D3D11_TEXTURE2D_DESC texDesc; 49 | ZeroMemory(&texDesc, sizeof(D3D11_TEXTURE2D_DESC)); 50 | texDesc.Width = width; 51 | texDesc.Height = height; 52 | texDesc.MipLevels = texDesc.ArraySize = 1; 53 | texDesc.Format = DXGI_FORMAT_A8_UNORM; 54 | texDesc.SampleDesc.Count = 1; 55 | texDesc.Usage = D3D11_USAGE_DYNAMIC; 56 | texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE; 57 | texDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; 58 | texDesc.MiscFlags = 0; 59 | 60 | HRESULT result = mD3D11Device->CreateTexture2D(&texDesc, NULL, (ID3D11Texture2D**)(&(mTextures[0]))); 61 | if (FAILED(result)) { 62 | LOG("Create texture Y fail. 
Error code: %x\n", result); 63 | } 64 | 65 | D3D11_SHADER_RESOURCE_VIEW_DESC shaderResourceViewDesc; 66 | shaderResourceViewDesc.Format = DXGI_FORMAT_A8_UNORM; 67 | shaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D; 68 | shaderResourceViewDesc.Texture2D.MostDetailedMip = 0; 69 | shaderResourceViewDesc.Texture2D.MipLevels = 1; 70 | 71 | result = mD3D11Device->CreateShaderResourceView((ID3D11Texture2D*)(mTextures[0]), &shaderResourceViewDesc, &(mShaderResourceView[0])); 72 | if (FAILED(result)) { 73 | LOG("Create shader resource view Y fail. Error code: %x\n", result); 74 | } 75 | 76 | // UV channel 77 | texDesc.Width = width / 2; 78 | texDesc.Height = height / 2; 79 | result = mD3D11Device->CreateTexture2D(&texDesc, NULL, (ID3D11Texture2D**)(&(mTextures[1]))); 80 | if (FAILED(result)) { 81 | LOG("Create texture U fail. Error code: %x\n", result); 82 | } 83 | 84 | result = mD3D11Device->CreateShaderResourceView((ID3D11Texture2D*)(mTextures[1]), &shaderResourceViewDesc, &(mShaderResourceView[1])); 85 | if (FAILED(result)) { 86 | LOG("Create shader resource view U fail. Error code: %x\n", result); 87 | } 88 | 89 | result = mD3D11Device->CreateTexture2D(&texDesc, NULL, (ID3D11Texture2D**)(&(mTextures[2]))); 90 | if (FAILED(result)) { 91 | LOG("Create texture V fail. Error code: %x\n", result); 92 | } 93 | 94 | result = mD3D11Device->CreateShaderResourceView((ID3D11Texture2D*)(mTextures[2]), &shaderResourceViewDesc, &(mShaderResourceView[2])); 95 | if (FAILED(result)) { 96 | LOG("Create shader resource view V fail. 
%x\n", result); 97 | } 98 | } 99 | 100 | void DX11TextureObject::upload(unsigned char* ych, unsigned char* uch, unsigned char* vch) { 101 | if (mD3D11Device == NULL) { 102 | return; 103 | } 104 | 105 | ID3D11DeviceContext* ctx = NULL; 106 | mD3D11Device->GetImmediateContext(&ctx); 107 | 108 | D3D11_MAPPED_SUBRESOURCE mappedResource[TEXTURE_NUM]; 109 | for (int i = 0; i < TEXTURE_NUM; i++) { 110 | ZeroMemory(&(mappedResource[i]), sizeof(D3D11_MAPPED_SUBRESOURCE)); 111 | ctx->Map(mTextures[i], 0, D3D11_MAP_WRITE_DISCARD, 0, &(mappedResource[i])); 112 | } 113 | 114 | // Consider padding. 115 | UINT rowPitchY = mappedResource[0].RowPitch; 116 | UINT rowPitchUV = mappedResource[1].RowPitch; 117 | 118 | uint8_t* ptrMappedY = (uint8_t*)(mappedResource[0].pData); 119 | uint8_t* ptrMappedU = (uint8_t*)(mappedResource[1].pData); 120 | uint8_t* ptrMappedV = (uint8_t*)(mappedResource[2].pData); 121 | 122 | // Two-thread memory copy 123 | std::thread YThread = std::thread([&]() { 124 | // The mapped region has its own row pitch, which may differ from the texture width. 125 | if (mWidthY == rowPitchY) { 126 | memcpy(ptrMappedY, ych, mLengthY); 127 | } 128 | else { 129 | // Handle row pitch of mapped memory. 130 | uint8_t* end = ych + mLengthY; 131 | while (ych != end) { 132 | memcpy(ptrMappedY, ych, mWidthY); 133 | ych += mWidthY; 134 | ptrMappedY += rowPitchY; 135 | } 136 | } 137 | }); 138 | 139 | std::thread UVThread = std::thread([&]() { 140 | if (mWidthUV == rowPitchUV) { 141 | memcpy(ptrMappedU, uch, mLengthUV); 142 | memcpy(ptrMappedV, vch, mLengthUV); 143 | } 144 | else { 145 | // Handle row pitch of mapped memory.
146 | // YUV420, length U == length V 147 | uint8_t* endU = uch + mLengthUV; 148 | while (uch != endU) { 149 | memcpy(ptrMappedU, uch, mWidthUV); 150 | memcpy(ptrMappedV, vch, mWidthUV); 151 | uch += mWidthUV; 152 | vch += mWidthUV; 153 | ptrMappedU += rowPitchUV; 154 | ptrMappedV += rowPitchUV; 155 | } 156 | } 157 | }); 158 | 159 | if (YThread.joinable()) { 160 | YThread.join(); 161 | } 162 | if (UVThread.joinable()) { 163 | UVThread.join(); 164 | } 165 | 166 | for (int i = 0; i < TEXTURE_NUM; i++) { 167 | ctx->Unmap(mTextures[i], 0); 168 | } 169 | ctx->Release(); 170 | } 171 | 172 | void DX11TextureObject::destroy() { 173 | mD3D11Device = NULL; 174 | mWidthY = mHeightY = mLengthY = 0; 175 | mWidthUV = mHeightUV = mLengthUV = 0; 176 | 177 | for (int i = 0; i < TEXTURE_NUM; i++) { 178 | if (mTextures[i] != NULL) { 179 | mTextures[i]->Release(); 180 | mTextures[i] = NULL; 181 | } 182 | 183 | if (mShaderResourceView[i] != NULL) { 184 | mShaderResourceView[i]->Release(); 185 | mShaderResourceView[i] = NULL; 186 | } 187 | } 188 | } -------------------------------------------------------------------------------- /NativeCode/DX11TextureObject.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | #pragma once 4 | #include "ITextureObject.h" 5 | #include <d3d11.h> 6 | 7 | class DX11TextureObject : public virtual ITextureObject 8 | { 9 | public: 10 | DX11TextureObject(); 11 | ~DX11TextureObject(); 12 | 13 | void create(void* handler, unsigned int width, unsigned int height); 14 | void getResourcePointers(void*& ptry, void*& ptru, void*& ptrv); 15 | void upload(unsigned char* ych, unsigned char* uch, unsigned char* vch); 16 | void destroy(); 17 | 18 | private: 19 | ID3D11Device* mD3D11Device; 20 | 21 | unsigned int mWidthY; 22 | unsigned int mHeightY; 23 | unsigned int mLengthY; 24 | 25 | unsigned int mWidthUV; 26 | unsigned int mHeightUV; 27 | unsigned int mLengthUV; 28 | 29 | ID3D11Texture2D* mTextures[TEXTURE_NUM]; 30 | ID3D11ShaderResourceView* mShaderResourceView[TEXTURE_NUM]; 31 | }; -------------------------------------------------------------------------------- /NativeCode/DecoderFFmpeg.cpp: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #include "DecoderFFmpeg.h" 4 | #include "Logger.h" 5 | #include <string.h> 6 | #include <time.h> 7 | 8 | DecoderFFmpeg::DecoderFFmpeg() { 9 | mAVFormatContext = NULL; 10 | mVideoStream = NULL; 11 | mAudioStream = NULL; 12 | mVideoCodec = NULL; 13 | mAudioCodec = NULL; 14 | mVideoCodecContext = NULL; 15 | mAudioCodecContext = NULL; 16 | av_init_packet(&mPacket); 17 | 18 | mSwrContext = NULL; 19 | 20 | mVideoBuffMax = 64; 21 | mAudioBuffMax = 128; 22 | 23 | memset(&mVideoInfo, 0, sizeof(VideoInfo)); 24 | memset(&mAudioInfo, 0, sizeof(AudioInfo)); 25 | mIsInitialized = false; 26 | mIsAudioAllChEnabled = false; 27 | mUseTCP = false; 28 | mIsSeekToAny = false; 29 | } 30 | 31 | DecoderFFmpeg::~DecoderFFmpeg() { 32 | destroy(); 33 | } 34 | 35 | bool DecoderFFmpeg::init(const char* filePath) { 36 | if (mIsInitialized) { 37 | LOG("Decoder has been initialized.
\n"); 38 | return true; 39 | } 40 | 41 | if (filePath == NULL) { 42 | LOG("File path is NULL. \n"); 43 | return false; 44 | } 45 | 46 | av_register_all(); 47 | 48 | if (mAVFormatContext == NULL) { 49 | mAVFormatContext = avformat_alloc_context(); 50 | } 51 | 52 | int errorCode = 0; 53 | errorCode = loadConfig(); 54 | if (errorCode < 0) { 55 | LOG("config loading error. \n"); 56 | LOG("Use default settings. \n"); 57 | mVideoBuffMax = 64; 58 | mAudioBuffMax = 128; 59 | mUseTCP = false; 60 | mIsSeekToAny = false; 61 | } 62 | 63 | AVDictionary* opts = NULL; 64 | if (mUseTCP) { 65 | av_dict_set(&opts, "rtsp_transport", "tcp", 0); 66 | } 67 | 68 | errorCode = avformat_open_input(&mAVFormatContext, filePath, NULL, &opts); 69 | av_dict_free(&opts); 70 | if (errorCode < 0) { 71 | LOG("avformat_open_input error(%x). \n", errorCode); 72 | printErrorMsg(errorCode); 73 | return false; 74 | } 75 | 76 | errorCode = avformat_find_stream_info(mAVFormatContext, NULL); 77 | if (errorCode < 0) { 78 | LOG("avformat_find_stream_info error(%x). \n", errorCode); 79 | printErrorMsg(errorCode); 80 | return false; 81 | } 82 | 83 | double ctxDuration = (double)(mAVFormatContext->duration) / AV_TIME_BASE; 84 | 85 | /* Video initialization */ 86 | int videoStreamIndex = av_find_best_stream(mAVFormatContext, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0); 87 | if (videoStreamIndex < 0) { 88 | LOG("video stream not found. \n"); 89 | mVideoInfo.isEnabled = false; 90 | } else { 91 | mVideoInfo.isEnabled = true; 92 | mVideoStream = mAVFormatContext->streams[videoStreamIndex]; 93 | mVideoCodecContext = mVideoStream->codec; 94 | mVideoCodecContext->refcounted_frames = 1; 95 | mVideoCodec = avcodec_find_decoder(mVideoCodecContext->codec_id); 96 | 97 | if (mVideoCodec == NULL) { 98 | LOG("Video codec not available. 
\n"); 99 | return false; 100 | } 101 | AVDictionary *autoThread = nullptr; 102 | av_dict_set(&autoThread, "threads", "auto", 0); 103 | errorCode = avcodec_open2(mVideoCodecContext, mVideoCodec, &autoThread); 104 | av_dict_free(&autoThread); 105 | if (errorCode < 0) { 106 | LOG("Could not open video codec(%x). \n", errorCode); 107 | printErrorMsg(errorCode); 108 | return false; 109 | } 110 | 111 | // Save the output video format 112 | // Duration / time_base = video time (seconds) 113 | mVideoInfo.width = mVideoCodecContext->width; 114 | mVideoInfo.height = mVideoCodecContext->height; 115 | mVideoInfo.totalTime = mVideoStream->duration <= 0 ? ctxDuration : mVideoStream->duration * av_q2d(mVideoStream->time_base); 116 | 117 | mVideoFrames.swap(decltype(mVideoFrames)()); 118 | } 119 | 120 | /* Audio initialization */ 121 | int audioStreamIndex = av_find_best_stream(mAVFormatContext, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0); 122 | if (audioStreamIndex < 0) { 123 | LOG("audio stream not found. \n"); 124 | mAudioInfo.isEnabled = false; 125 | } else { 126 | mAudioInfo.isEnabled = true; 127 | mAudioStream = mAVFormatContext->streams[audioStreamIndex]; 128 | mAudioCodecContext = mAudioStream->codec; 129 | mAudioCodec = avcodec_find_decoder(mAudioCodecContext->codec_id); 130 | 131 | if (mAudioCodec == NULL) { 132 | LOG("Audio codec not available. \n"); 133 | return false; 134 | } 135 | 136 | errorCode = avcodec_open2(mAudioCodecContext, mAudioCodec, NULL); 137 | if (errorCode < 0) { 138 | LOG("Could not open audio codec(%x). 
\n", errorCode); 139 | printErrorMsg(errorCode); 140 | return false; 141 | } 142 | 143 | errorCode = initSwrContext(); 144 | if (errorCode < 0) { 145 | LOG("Init SwrContext error.(%x) \n", errorCode); 146 | printErrorMsg(errorCode); 147 | return false; 148 | } 149 | 150 | mAudioFrames.swap(decltype(mAudioFrames)()); 151 | } 152 | 153 | mIsInitialized = true; 154 | 155 | return true; 156 | } 157 | 158 | bool DecoderFFmpeg::decode() { 159 | if (!mIsInitialized) { 160 | LOG("Not initialized. \n"); 161 | return false; 162 | } 163 | 164 | if (!isBuffBlocked()) { 165 | if (av_read_frame(mAVFormatContext, &mPacket) < 0) { 166 | updateVideoFrame(); 167 | LOG("End of file.\n"); 168 | return false; 169 | } 170 | 171 | if (mVideoInfo.isEnabled && mPacket.stream_index == mVideoStream->index) { 172 | updateVideoFrame(); 173 | } else if (mAudioInfo.isEnabled && mPacket.stream_index == mAudioStream->index) { 174 | updateAudioFrame(); 175 | } 176 | 177 | av_packet_unref(&mPacket); 178 | } 179 | 180 | return true; 181 | } 182 | 183 | IDecoder::VideoInfo DecoderFFmpeg::getVideoInfo() { 184 | return mVideoInfo; 185 | } 186 | 187 | IDecoder::AudioInfo DecoderFFmpeg::getAudioInfo() { 188 | return mAudioInfo; 189 | } 190 | 191 | void DecoderFFmpeg::setVideoEnable(bool isEnable) { 192 | if (mVideoStream == NULL) { 193 | LOG("Video stream not found. \n"); 194 | return; 195 | } 196 | 197 | mVideoInfo.isEnabled = isEnable; 198 | } 199 | 200 | void DecoderFFmpeg::setAudioEnable(bool isEnable) { 201 | if (mAudioStream == NULL) { 202 | LOG("Audio stream not found. \n"); 203 | return; 204 | } 205 | 206 | mAudioInfo.isEnabled = isEnable; 207 | } 208 | 209 | void DecoderFFmpeg::setAudioAllChDataEnable(bool isEnable) { 210 | mIsAudioAllChEnabled = isEnable; 211 | initSwrContext(); 212 | } 213 | 214 | int DecoderFFmpeg::initSwrContext() { 215 | if (mAudioCodecContext == NULL) { 216 | LOG("Audio context is null. 
\n"); 217 | return -1; 218 | } 219 | 220 | int errorCode = 0; 221 | int64_t inChannelLayout = av_get_default_channel_layout(mAudioCodecContext->channels); 222 | uint64_t outChannelLayout = mIsAudioAllChEnabled ? inChannelLayout : AV_CH_LAYOUT_STEREO; 223 | AVSampleFormat inSampleFormat = mAudioCodecContext->sample_fmt; 224 | AVSampleFormat outSampleFormat = AV_SAMPLE_FMT_FLT; 225 | int inSampleRate = mAudioCodecContext->sample_rate; 226 | int outSampleRate = inSampleRate; 227 | 228 | if (mSwrContext != NULL) { 229 | swr_close(mSwrContext); 230 | swr_free(&mSwrContext); 231 | mSwrContext = NULL; 232 | } 233 | 234 | mSwrContext = swr_alloc_set_opts(NULL, 235 | outChannelLayout, outSampleFormat, outSampleRate, 236 | inChannelLayout, inSampleFormat, inSampleRate, 237 | 0, NULL); 238 | 239 | 240 | if (swr_is_initialized(mSwrContext) == 0) { 241 | errorCode = swr_init(mSwrContext); 242 | } 243 | 244 | // Save the output audio format 245 | mAudioInfo.channels = av_get_channel_layout_nb_channels(outChannelLayout); 246 | mAudioInfo.sampleRate = outSampleRate; 247 | mAudioInfo.totalTime = mAudioStream->duration <= 0 ? (double)(mAVFormatContext->duration) / AV_TIME_BASE : mAudioStream->duration * av_q2d(mAudioStream->time_base); 248 | 249 | return errorCode; 250 | } 251 | 252 | double DecoderFFmpeg::getVideoFrame(unsigned char** outputY, unsigned char** outputU, unsigned char** outputV) { 253 | std::lock_guard<std::mutex> lock(mVideoMutex); 254 | 255 | if (!mIsInitialized || mVideoFrames.size() == 0) { 256 | LOG("Video frame not available.
\n"); 257 | *outputY = *outputU = *outputV = NULL; 258 | return -1; 259 | } 260 | 261 | AVFrame* frame = mVideoFrames.front(); 262 | *outputY = frame->data[0]; 263 | *outputU = frame->data[1]; 264 | *outputV = frame->data[2]; 265 | 266 | int64_t timeStamp = av_frame_get_best_effort_timestamp(frame); 267 | double timeInSec = av_q2d(mVideoStream->time_base) * timeStamp; 268 | mVideoInfo.lastTime = timeInSec; 269 | 270 | return timeInSec; 271 | } 272 | 273 | double DecoderFFmpeg::getAudioFrame(unsigned char** outputFrame, int& frameSize) { 274 | std::lock_guard<std::mutex> lock(mAudioMutex); 275 | if (!mIsInitialized || mAudioFrames.size() == 0) { 276 | LOG("Audio frame not available. \n"); 277 | *outputFrame = NULL; 278 | return -1; 279 | } 280 | 281 | AVFrame* frame = mAudioFrames.front(); 282 | *outputFrame = frame->data[0]; 283 | frameSize = frame->nb_samples; 284 | int64_t timeStamp = av_frame_get_best_effort_timestamp(frame); 285 | double timeInSec = av_q2d(mAudioStream->time_base) * timeStamp; 286 | mAudioInfo.lastTime = timeInSec; 287 | 288 | return timeInSec; 289 | } 290 | 291 | void DecoderFFmpeg::seek(double time) { 292 | if (!mIsInitialized) { 293 | LOG("Not initialized. \n"); 294 | return; 295 | } 296 | 297 | uint64_t timeStamp = (uint64_t)(time * AV_TIME_BASE); 298 | 299 | if (0 > av_seek_frame(mAVFormatContext, -1, timeStamp, mIsSeekToAny ?
AVSEEK_FLAG_ANY : AVSEEK_FLAG_BACKWARD)) { 300 | LOG("Seek failed.\n"); 301 | return; 302 | } 303 | 304 | if (mVideoInfo.isEnabled) { 305 | if (mVideoCodecContext != NULL) { 306 | avcodec_flush_buffers(mVideoCodecContext); 307 | } 308 | flushBuffer(&mVideoFrames, &mVideoMutex); 309 | mVideoInfo.lastTime = -1; 310 | } 311 | 312 | if (mAudioInfo.isEnabled) { 313 | if (mAudioCodecContext != NULL) { 314 | avcodec_flush_buffers(mAudioCodecContext); 315 | } 316 | flushBuffer(&mAudioFrames, &mAudioMutex); 317 | mAudioInfo.lastTime = -1; 318 | } 319 | } 320 | 321 | int DecoderFFmpeg::getMetaData(char**& key, char**& value) { 322 | if (!mIsInitialized || key != NULL || value != NULL) { 323 | return 0; 324 | } 325 | 326 | AVDictionaryEntry *tag = NULL; 327 | int metaCount = av_dict_count(mAVFormatContext->metadata); 328 | 329 | key = (char**)malloc(sizeof(char*) * metaCount); 330 | value = (char**)malloc(sizeof(char*) * metaCount); 331 | 332 | for (int i = 0; i < metaCount; i++) { 333 | tag = av_dict_get(mAVFormatContext->metadata, "", tag, AV_DICT_IGNORE_SUFFIX); 334 | key[i] = tag->key; 335 | value[i] = tag->value; 336 | } 337 | 338 | return metaCount; 339 | } 340 | 341 | void DecoderFFmpeg::destroy() { 342 | if (mVideoCodecContext != NULL) { 343 | avcodec_close(mVideoCodecContext); 344 | mVideoCodecContext = NULL; 345 | } 346 | 347 | if (mAudioCodecContext != NULL) { 348 | avcodec_close(mAudioCodecContext); 349 | mAudioCodecContext = NULL; 350 | } 351 | 352 | if (mAVFormatContext != NULL) { 353 | avformat_close_input(&mAVFormatContext); 354 | // avformat_close_input() also frees the context and resets the pointer, so no separate avformat_free_context() call is needed. 355 | mAVFormatContext = NULL; 356 | } 357 | 358 | if (mSwrContext != NULL) { 359 | swr_close(mSwrContext); 360 | swr_free(&mSwrContext); 361 | mSwrContext = NULL; 362 | } 363 | 364 | flushBuffer(&mVideoFrames, &mVideoMutex); 365 | flushBuffer(&mAudioFrames, &mAudioMutex); 366 | 367 | mVideoCodec = NULL; 368 | mAudioCodec = NULL; 369 | 370 | mVideoStream = NULL; 371 | mAudioStream = NULL; 
372 | av_packet_unref(&mPacket); 373 | 374 | memset(&mVideoInfo, 0, sizeof(VideoInfo)); 375 | memset(&mAudioInfo, 0, sizeof(AudioInfo)); 376 | 377 | mIsInitialized = false; 378 | mIsAudioAllChEnabled = false; 379 | mVideoBuffMax = 64; 380 | mAudioBuffMax = 128; 381 | mUseTCP = false; 382 | mIsSeekToAny = false; 383 | } 384 | 385 | bool DecoderFFmpeg::isBuffBlocked() { 386 | bool ret = false; 387 | if (mVideoInfo.isEnabled && mVideoFrames.size() >= mVideoBuffMax) { 388 | ret = true; 389 | } 390 | 391 | if (mAudioInfo.isEnabled && mAudioFrames.size() >= mAudioBuffMax) { 392 | ret = true; 393 | } 394 | 395 | return ret; 396 | } 397 | 398 | void DecoderFFmpeg::updateVideoFrame() { 399 | int isFrameAvailable = 0; 400 | AVFrame* frame = av_frame_alloc(); 401 | clock_t start = clock(); 402 | if (avcodec_decode_video2(mVideoCodecContext, frame, &isFrameAvailable, &mPacket) < 0) { 403 | LOG("Error processing data. \n"); 404 | av_frame_free(&frame); return; // Release the frame on decode error to avoid a leak. 405 | } 406 | LOG("updateVideoFrame = %f\n", (float)(clock() - start) / CLOCKS_PER_SEC); 407 | 408 | if (isFrameAvailable) { 409 | std::lock_guard<std::mutex> lock(mVideoMutex); 410 | mVideoFrames.push(frame); 411 | updateBufferState(); 412 | } else { av_frame_free(&frame); } // No frame was produced; release the allocation. 413 | } 414 | 415 | void DecoderFFmpeg::updateAudioFrame() { 416 | int isFrameAvailable = 0; 417 | AVFrame* frameDecoded = av_frame_alloc(); 418 | if (avcodec_decode_audio4(mAudioCodecContext, frameDecoded, &isFrameAvailable, &mPacket) < 0) { 419 | LOG("Error processing data. \n"); 420 | av_frame_free(&frameDecoded); return; // Release the frame on decode error to avoid a leak. 421 | } 422 | 423 | AVFrame* frame = av_frame_alloc(); 424 | frame->sample_rate = frameDecoded->sample_rate; 425 | frame->channel_layout = av_get_default_channel_layout(mAudioInfo.channels); 426 | frame->format = AV_SAMPLE_FMT_FLT; // For Unity format. 
427 | frame->best_effort_timestamp = frameDecoded->best_effort_timestamp; 428 | swr_convert_frame(mSwrContext, frame, frameDecoded); 429 | 430 | std::lock_guard<std::mutex> lock(mAudioMutex); 431 | mAudioFrames.push(frame); 432 | updateBufferState(); 433 | av_frame_free(&frameDecoded); 434 | } 435 | 436 | void DecoderFFmpeg::freeVideoFrame() { 437 | freeFrontFrame(&mVideoFrames, &mVideoMutex); 438 | } 439 | 440 | void DecoderFFmpeg::freeAudioFrame() { 441 | freeFrontFrame(&mAudioFrames, &mAudioMutex); 442 | } 443 | 444 | void DecoderFFmpeg::freeFrontFrame(std::queue<AVFrame*>* frameBuff, std::mutex* mutex) { 445 | std::lock_guard<std::mutex> lock(*mutex); 446 | if (!mIsInitialized || frameBuff->size() == 0) { 447 | LOG("Not initialized or buffer empty. \n"); 448 | return; 449 | } 450 | 451 | AVFrame* frame = frameBuff->front(); 452 | av_frame_free(&frame); 453 | frameBuff->pop(); 454 | updateBufferState(); 455 | } 456 | 457 | // Clearing the queue would release only the pointers, not the AVFrame resources they refer to, so each frame must be freed explicitly. 458 | void DecoderFFmpeg::flushBuffer(std::queue<AVFrame*>* frameBuff, std::mutex* mutex) { 459 | std::lock_guard<std::mutex> lock(*mutex); 460 | while (!frameBuff->empty()) { 461 | av_frame_free(&(frameBuff->front())); 462 | frameBuff->pop(); 463 | } 464 | } 465 | 466 | // Record the buffer state (EMPTY, NORMAL, or FULL); ViveMediaDecoder.cs uses it to decide when to buffer. 
467 | void DecoderFFmpeg::updateBufferState() { 468 | if (mVideoInfo.isEnabled) { 469 | if (mVideoFrames.size() >= mVideoBuffMax) { 470 | mVideoInfo.bufferState = BufferState::FULL; 471 | } else if (mVideoFrames.size() == 0) { 472 | mVideoInfo.bufferState = BufferState::EMPTY; 473 | } else { 474 | mVideoInfo.bufferState = BufferState::NORMAL; 475 | } 476 | } 477 | 478 | if (mAudioInfo.isEnabled) { 479 | if (mAudioFrames.size() >= mAudioBuffMax) { 480 | mAudioInfo.bufferState = BufferState::FULL; 481 | } else if (mAudioFrames.size() == 0) { 482 | mAudioInfo.bufferState = BufferState::EMPTY; 483 | } else { 484 | mAudioInfo.bufferState = BufferState::NORMAL; 485 | } 486 | } 487 | } 488 | 489 | int DecoderFFmpeg::loadConfig() { 490 | std::ifstream configFile("config", std::ifstream::in); 491 | if (!configFile) { 492 | LOG("config does not exist.\n"); 493 | return -1; 494 | } 495 | 496 | enum CONFIG { NONE, USE_TCP, BUFF_MIN, BUFF_MAX }; 497 | int buffVideoMax = 0, buffAudioMax = 0, tcp = 0, seekAny = 0; 498 | std::string line; 499 | while (configFile >> line) { 500 | std::string token = line.substr(0, line.find("=")); 501 | CONFIG config = NONE; 502 | std::string value = line.substr(line.find("=") + 1); 503 | try { 504 | if (token == "USE_TCP") { tcp = stoi(value); } 505 | else if (token == "BUFF_VIDEO_MAX") { buffVideoMax = stoi(value); } 506 | else if (token == "BUFF_AUDIO_MAX") { buffAudioMax = stoi(value); } 507 | else if (token == "SEEK_ANY") { seekAny = stoi(value); } 508 | 509 | } catch (...) { 510 | return -1; 511 | } 512 | } 513 | 514 | mUseTCP = tcp != 0; 515 | mVideoBuffMax = buffVideoMax > 0 ? buffVideoMax : mVideoBuffMax; // Keep the default when the key is absent or invalid; a limit of 0 would block decoding forever. 516 | mAudioBuffMax = buffAudioMax > 0 ? buffAudioMax : mAudioBuffMax; 517 | mIsSeekToAny = seekAny != 0; 518 | LOG("config loaded successfully.\n"); 519 | LOG("USE_TCP=%s\n", mUseTCP ? "true" : "false"); 520 | LOG("BUFF_VIDEO_MAX=%d\n", mVideoBuffMax); 521 | LOG("BUFF_AUDIO_MAX=%d\n", mAudioBuffMax); 522 | LOG("SEEK_ANY=%s\n", mIsSeekToAny ? 
"true" : "false"); 523 | 524 | return 0; 525 | } 526 | 527 | void DecoderFFmpeg::printErrorMsg(int errorCode) { 528 | char msg[500]; 529 | av_strerror(errorCode, msg, sizeof(msg)); 530 | LOG("Error massage: %s \n", msg); 531 | } -------------------------------------------------------------------------------- /NativeCode/DecoderFFmpeg.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #pragma once 4 | #include "IDecoder.h" 5 | #include 6 | #include 7 | 8 | extern "C" { 9 | #include 10 | #include 11 | } 12 | 13 | class DecoderFFmpeg : public virtual IDecoder 14 | { 15 | public: 16 | DecoderFFmpeg(); 17 | ~DecoderFFmpeg(); 18 | 19 | bool init(const char* filePath); 20 | bool decode(); 21 | void seek(double time); 22 | void destroy(); 23 | 24 | VideoInfo getVideoInfo(); 25 | AudioInfo getAudioInfo(); 26 | void setVideoEnable(bool isEnable); 27 | void setAudioEnable(bool isEnable); 28 | void setAudioAllChDataEnable(bool isEnable); 29 | double getVideoFrame(unsigned char** outputY, unsigned char** outputU, unsigned char** outputV); 30 | double getAudioFrame(unsigned char** outputFrame, int& frameSize); 31 | void freeVideoFrame(); 32 | void freeAudioFrame(); 33 | 34 | int getMetaData(char**& key, char**& value); 35 | 36 | private: 37 | bool mIsInitialized; 38 | bool mIsAudioAllChEnabled; 39 | bool mUseTCP; // For RTSP stream. 
40 | 41 | AVFormatContext* mAVFormatContext; 42 | AVStream* mVideoStream; 43 | AVStream* mAudioStream; 44 | AVCodec* mVideoCodec; 45 | AVCodec* mAudioCodec; 46 | AVCodecContext* mVideoCodecContext; 47 | AVCodecContext* mAudioCodecContext; 48 | 49 | AVPacket mPacket; 50 | std::queue<AVFrame*> mVideoFrames; 51 | std::queue<AVFrame*> mAudioFrames; 52 | unsigned int mVideoBuffMax; 53 | unsigned int mAudioBuffMax; 54 | 55 | SwrContext* mSwrContext; 56 | int initSwrContext(); 57 | 58 | VideoInfo mVideoInfo; 59 | AudioInfo mAudioInfo; 60 | void updateBufferState(); 61 | 62 | int mFrameBufferNum; 63 | 64 | bool isBuffBlocked(); 65 | void updateVideoFrame(); 66 | void updateAudioFrame(); 67 | void freeFrontFrame(std::queue<AVFrame*>* frameBuff, std::mutex* mutex); 68 | void flushBuffer(std::queue<AVFrame*>* frameBuff, std::mutex* mutex); 69 | std::mutex mVideoMutex; 70 | std::mutex mAudioMutex; 71 | 72 | bool mIsSeekToAny; 73 | 74 | int loadConfig(); 75 | void printErrorMsg(int errorCode); 76 | }; -------------------------------------------------------------------------------- /NativeCode/IDecoder.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | #pragma once 4 | 5 | class IDecoder 6 | { 7 | public: 8 | virtual ~IDecoder() {} 9 | 10 | enum BufferState {EMPTY, NORMAL, FULL}; 11 | 12 | struct VideoInfo { 13 | bool isEnabled; 14 | int width; 15 | int height; 16 | double lastTime; 17 | double totalTime; 18 | BufferState bufferState; 19 | }; 20 | 21 | struct AudioInfo { 22 | bool isEnabled; 23 | unsigned int channels; 24 | unsigned int sampleRate; 25 | double lastTime; 26 | double totalTime; 27 | BufferState bufferState; 28 | }; 29 | 30 | virtual bool init(const char* filePath) = 0; 31 | virtual bool decode() = 0; 32 | virtual void seek(double time) = 0; 33 | virtual void destroy() = 0; 34 | 35 | virtual VideoInfo getVideoInfo() = 0; 36 | virtual AudioInfo getAudioInfo() = 0; 37 | virtual void setVideoEnable(bool isEnable) = 0; 38 | virtual void setAudioEnable(bool isEnable) = 0; 39 | virtual void setAudioAllChDataEnable(bool isEnable) = 0; 40 | virtual double getVideoFrame(unsigned char** outputY, unsigned char** outputU, unsigned char** outputV) = 0; 41 | virtual double getAudioFrame(unsigned char** outputFrame, int& frameSize) = 0; 42 | virtual void freeVideoFrame() = 0; 43 | virtual void freeAudioFrame() = 0; 44 | 45 | virtual int getMetaData(char**& key, char**& value) = 0; 46 | }; -------------------------------------------------------------------------------- /NativeCode/ITextureObject.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | #pragma once 4 | 5 | class ITextureObject { 6 | public: 7 | virtual ~ITextureObject() {} 8 | virtual void create(void* handler, unsigned int width, unsigned int height) = 0; 9 | virtual void getResourcePointers(void*& ptry, void*& ptru, void*& ptrv) = 0; 10 | virtual void upload(unsigned char* ych, unsigned char* uch, unsigned char* vch) = 0; 11 | virtual void destroy() = 0; 12 | 13 | static const unsigned int CPU_ALIGMENT = 64; 14 | static const unsigned int TEXTURE_NUM = 3; 15 | }; -------------------------------------------------------------------------------- /NativeCode/Logger.cpp: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #include "Logger.h" 4 | 5 | #pragma warning(disable:4996) 6 | 7 | Logger* Logger::_instance; 8 | Logger::Logger() { 9 | fclose(stdout); 10 | freopen("NativeLog.txt", "a", stdout); 11 | } 12 | 13 | Logger* Logger::instance() { 14 | if (!_instance) { 15 | _instance = new Logger(); 16 | } 17 | return _instance; 18 | } 19 | 20 | void Logger::log(const char* str, ...) { 21 | va_list args; 22 | va_start(args, str); 23 | vprintf(str, args); 24 | va_end(args); 25 | 26 | fflush(stdout); 27 | } -------------------------------------------------------------------------------- /NativeCode/Logger.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #pragma once 4 | #include <stdio.h> 5 | #include <stdarg.h> 6 | 7 | //#define ENABLE_LOG 8 | #ifdef ENABLE_LOG 9 | #define LOG(...) 
Logger::instance()->log(__VA_ARGS__) 10 | #else 11 | #define LOG 12 | #endif 13 | 14 | class Logger { 15 | public: 16 | static Logger* instance(); 17 | void log(const char* str, ...); 18 | private: 19 | Logger(); 20 | static Logger* _instance; 21 | }; -------------------------------------------------------------------------------- /NativeCode/ViveMediaDecoder.cpp: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #include "Unity\IUnityGraphics.h" 4 | #include "ViveMediaDecoder.h" 5 | #include "AVHandler.h" 6 | #include "Logger.h" 7 | #include "DX11TextureObject.h" 8 | #include <string> 9 | #include <list> 10 | #include <thread> 11 | #include <memory> 12 | #include <math.h> 13 | 14 | using namespace std; 15 | 16 | typedef struct _VideoContext { 17 | int id = -1; 18 | string path = ""; 19 | thread initThread; 20 | shared_ptr<AVHandler> avhandler = NULL; 21 | unique_ptr<ITextureObject> textureObj = NULL; 22 | float progressTime = 0.0f; 23 | float lastUpdateTime = -1.0f; 24 | bool isContentReady = false; // Marks the period from the end of a seek until the first frame arrives. 25 | // Used for AV sync; in the pure-audio case it should be ignored. 
26 | } VideoContext; 27 | 28 | ID3D11Device* g_D3D11Device = NULL; 29 | list<shared_ptr<VideoContext>> videoContexts; 30 | typedef list<shared_ptr<VideoContext>>::iterator VideoContextIter; 31 | // -------------------------------------------------------------------------- 32 | static IUnityInterfaces* s_UnityInterfaces = NULL; 33 | static IUnityGraphics* s_Graphics = NULL; 34 | static UnityGfxRenderer s_DeviceType = kUnityGfxRendererNull; 35 | static void DoEventGraphicsDeviceD3D11(UnityGfxDeviceEventType eventType) 36 | { 37 | if (eventType == kUnityGfxDeviceEventInitialize) 38 | { 39 | IUnityGraphicsD3D11* d3d11 = s_UnityInterfaces->Get<IUnityGraphicsD3D11>(); 40 | g_D3D11Device = d3d11->GetDevice(); 41 | } 42 | } 43 | 44 | static void UNITY_INTERFACE_API OnGraphicsDeviceEvent(UnityGfxDeviceEventType eventType) 45 | { 46 | UnityGfxRenderer currentDeviceType = s_DeviceType; 47 | switch (eventType) 48 | { 49 | case kUnityGfxDeviceEventInitialize: 50 | { 51 | s_DeviceType = s_Graphics->GetRenderer(); 52 | currentDeviceType = s_DeviceType; 53 | break; 54 | } 55 | 56 | case kUnityGfxDeviceEventShutdown: 57 | s_DeviceType = kUnityGfxRendererNull; 58 | break; 59 | 60 | case kUnityGfxDeviceEventBeforeReset: 61 | break; 62 | 63 | case kUnityGfxDeviceEventAfterReset: 64 | break; 65 | }; 66 | 67 | #if SUPPORT_D3D11 68 | if (currentDeviceType == kUnityGfxRendererD3D11) 69 | DoEventGraphicsDeviceD3D11(eventType); 70 | #endif 71 | } 72 | 73 | extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces) 74 | { 75 | s_UnityInterfaces = unityInterfaces; 76 | s_Graphics = s_UnityInterfaces->Get<IUnityGraphics>(); 77 | s_Graphics->RegisterDeviceEventCallback(OnGraphicsDeviceEvent); 78 | 79 | // Run OnGraphicsDeviceEvent(initialize) manually on plugin load 80 | OnGraphicsDeviceEvent(kUnityGfxDeviceEventInitialize); 81 | } 82 | 83 | extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginUnload() 84 | { 85 | s_Graphics->UnregisterDeviceEventCallback(OnGraphicsDeviceEvent); 86 | } 87 | // 
-------------------------------------------------------------------------- 88 | 89 | bool getVideoContext(int id, shared_ptr<VideoContext>& videoCtx) { 90 | for (VideoContextIter it = videoContexts.begin(); it != videoContexts.end(); it++) { 91 | if ((*it)->id == id) { 92 | videoCtx = *it; 93 | return true; 94 | } 95 | } 96 | 97 | LOG("Decoder does not exist. \n"); 98 | return false; 99 | } 100 | 101 | void removeVideoContext(int id) { 102 | for (VideoContextIter it = videoContexts.begin(); it != videoContexts.end(); it++) { 103 | if ((*it)->id == id) { 104 | videoContexts.erase(it); 105 | return; 106 | } 107 | } 108 | } 109 | 110 | void DoRendering(int id); 111 | 112 | static void UNITY_INTERFACE_API OnRenderEvent(int eventID) 113 | { 114 | // Unknown graphics device type? Do nothing. 115 | if (s_DeviceType == kUnityGfxRendererNull) 116 | return; 117 | 118 | // Actual functions defined below 119 | DoRendering(eventID); 120 | } 121 | 122 | extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc() 123 | { 124 | return OnRenderEvent; 125 | } 126 | 127 | void DoRendering (int id) 128 | { 129 | if (s_DeviceType == kUnityGfxRendererD3D11 && g_D3D11Device != NULL) 130 | { 131 | ID3D11DeviceContext* ctx = NULL; 132 | g_D3D11Device->GetImmediateContext (&ctx); 133 | 134 | shared_ptr<VideoContext> localVideoContext; 135 | if (getVideoContext(id, localVideoContext)) { 136 | AVHandler* localAVHandler = localVideoContext->avhandler.get(); 137 | 138 | if (localAVHandler != NULL && localAVHandler->getDecoderState() >= AVHandler::DecoderState::INITIALIZED && localAVHandler->getVideoInfo().isEnabled) { 139 | if (localVideoContext->textureObj == NULL) { 140 | unsigned int width = localAVHandler->getVideoInfo().width; 141 | unsigned int height = localAVHandler->getVideoInfo().height; 142 | localVideoContext->textureObj = make_unique<DX11TextureObject>(); 143 | localVideoContext->textureObj->create(g_D3D11Device, width, height); 144 | } 145 | 146 | double videoDecCurTime = 
localAVHandler->getVideoInfo().lastTime; 147 | LOG("videoDecCurTime = %f \n", videoDecCurTime); 148 | if (videoDecCurTime <= localVideoContext->progressTime) { 149 | uint8_t* ptrY = NULL; 150 | uint8_t* ptrU = NULL; 151 | uint8_t* ptrV = NULL; 152 | double curFrameTime = localAVHandler->getVideoFrame(&ptrY, &ptrU, &ptrV); 153 | if (ptrY != NULL && curFrameTime != -1 && localVideoContext->lastUpdateTime != curFrameTime) { 154 | localVideoContext->textureObj->upload(ptrY, ptrU, ptrV); 155 | localVideoContext->lastUpdateTime = (float)curFrameTime; 156 | localVideoContext->isContentReady = true; 157 | } 158 | localAVHandler->freeVideoFrame(); 159 | } 160 | } 161 | } 162 | ctx->Release(); 163 | } 164 | } 165 | 166 | int nativeCreateDecoderAsync(const char* filePath, int& id) { 167 | LOG("Query available decoder id. \n"); 168 | 169 | int newID = 0; 170 | shared_ptr<VideoContext> videoCtx; 171 | while (getVideoContext(newID, videoCtx)) { newID++; } 172 | 173 | videoCtx = make_shared<VideoContext>(); 174 | videoCtx->avhandler = make_shared<AVHandler>(); 175 | videoCtx->id = newID; 176 | id = videoCtx->id; 177 | videoCtx->path = string(filePath); 178 | videoCtx->isContentReady = false; 179 | 180 | videoCtx->initThread = thread([videoCtx]() { 181 | videoCtx->avhandler->init(videoCtx->path.c_str()); 182 | }); 183 | 184 | videoContexts.push_back(videoCtx); 185 | 186 | return 0; 187 | } 188 | 189 | // Synchronized init. Used for thumbnail currently. 190 | int nativeCreateDecoder(const char* filePath, int& id) { 191 | LOG("Query available decoder id. 
\n"); 192 | 193 | int newID = 0; 194 | shared_ptr videoCtx; 195 | while (getVideoContext(newID, videoCtx)) { newID++; } 196 | 197 | videoCtx->avhandler = make_shared(); 198 | videoCtx->id = newID; 199 | id = videoCtx->id; 200 | videoCtx->path = string(filePath); 201 | videoCtx->isContentReady = false; 202 | videoCtx->avhandler->init(filePath); 203 | 204 | videoContexts.push_back(videoCtx); 205 | 206 | return 0; 207 | } 208 | 209 | int nativeGetDecoderState(int id) { 210 | shared_ptr videoCtx; 211 | if (!getVideoContext(id, videoCtx) || videoCtx->avhandler == NULL) { return -1; } 212 | 213 | return videoCtx->avhandler->getDecoderState(); 214 | } 215 | 216 | void nativeCreateTexture(int id, void*& tex0, void*& tex1, void*& tex2) { 217 | shared_ptr videoCtx; 218 | if (!getVideoContext(id, videoCtx) || videoCtx->textureObj == NULL) { return; } 219 | 220 | videoCtx->textureObj->getResourcePointers(tex0, tex1, tex2); 221 | } 222 | 223 | bool nativeStartDecoding(int id) { 224 | shared_ptr videoCtx; 225 | if (!getVideoContext(id, videoCtx) || videoCtx->avhandler == NULL) { return false; } 226 | 227 | if (videoCtx->initThread.joinable()) { 228 | videoCtx->initThread.join(); 229 | } 230 | 231 | auto avhandler = videoCtx->avhandler; 232 | if (avhandler->getDecoderState() >= AVHandler::DecoderState::INITIALIZED) { 233 | avhandler->startDecoding(); 234 | } 235 | 236 | if (!avhandler->getVideoInfo().isEnabled) { 237 | videoCtx->isContentReady = true; 238 | } 239 | 240 | return true; 241 | } 242 | 243 | void nativeDestroyDecoder(int id) { 244 | shared_ptr videoCtx; 245 | if (!getVideoContext(id, videoCtx)) { return; } 246 | 247 | if (videoCtx->initThread.joinable()) { 248 | videoCtx->initThread.join(); 249 | } 250 | 251 | videoCtx->avhandler = NULL; 252 | 253 | videoCtx->path.clear(); 254 | videoCtx->progressTime = 0.0f; 255 | videoCtx->lastUpdateTime = 0.0f; 256 | 257 | videoCtx->textureObj = NULL; 258 | 259 | videoCtx->isContentReady = false; 260 | 
removeVideoContext(videoCtx->id); 261 | videoCtx->id = -1; 262 | } 263 | 264 | // Video 265 | bool nativeIsVideoEnabled(int id) { 266 | shared_ptr<VideoContext> videoCtx; 267 | if (!getVideoContext(id, videoCtx)) { return false; } 268 | 269 | if (videoCtx->avhandler->getDecoderState() < AVHandler::DecoderState::INITIALIZED) { 270 | LOG("Decoder is unavailable currently. \n"); 271 | return false; 272 | } 273 | 274 | bool ret = videoCtx->avhandler->getVideoInfo().isEnabled; 275 | LOG("nativeIsVideoEnabled: %s \n", ret ? "true" : "false"); 276 | return ret; 277 | } 278 | 279 | void nativeGetVideoFormat(int id, int& width, int& height, float& totalTime) { 280 | shared_ptr<VideoContext> videoCtx; 281 | if (!getVideoContext(id, videoCtx)) { return; } 282 | 283 | if (videoCtx->avhandler->getDecoderState() < AVHandler::DecoderState::INITIALIZED) { 284 | LOG("Decoder is unavailable currently. \n"); 285 | return; 286 | } 287 | 288 | IDecoder::VideoInfo* videoInfo = &(videoCtx->avhandler->getVideoInfo()); 289 | width = videoInfo->width; 290 | height = videoInfo->height; 291 | totalTime = (float)(videoInfo->totalTime); 292 | } 293 | 294 | void nativeSetVideoTime(int id, float currentTime) { 295 | shared_ptr<VideoContext> videoCtx; 296 | if (!getVideoContext(id, videoCtx)) { return; } 297 | 298 | videoCtx->progressTime = currentTime; 299 | } 300 | 301 | bool nativeIsAudioEnabled(int id) { 302 | shared_ptr<VideoContext> videoCtx; 303 | if (!getVideoContext(id, videoCtx)) { return false; } 304 | 305 | if (videoCtx->avhandler->getDecoderState() < AVHandler::DecoderState::INITIALIZED) { 306 | LOG("Decoder is unavailable currently. \n"); 307 | return false; 308 | } 309 | 310 | bool ret = videoCtx->avhandler->getAudioInfo().isEnabled; 311 | LOG("nativeIsAudioEnabled: %s \n", ret ? 
"true" : "false"); 312 | return ret; 313 | } 314 | 315 | void nativeGetAudioFormat(int id, int& channel, int& frequency, float& totalTime) { 316 | shared_ptr videoCtx; 317 | if (!getVideoContext(id, videoCtx)) { return; } 318 | 319 | if (videoCtx->avhandler->getDecoderState() < AVHandler::DecoderState::INITIALIZED) { 320 | LOG("Decoder is unavailable currently. \n"); 321 | return; 322 | } 323 | 324 | IDecoder::AudioInfo* audioInfo = &(videoCtx->avhandler->getAudioInfo()); 325 | channel = audioInfo->channels; 326 | frequency = audioInfo->sampleRate; 327 | totalTime = (float)(audioInfo->totalTime); 328 | } 329 | 330 | float nativeGetAudioData(int id, unsigned char** audioData, int& frameSize) { 331 | shared_ptr videoCtx; 332 | if (!getVideoContext(id, videoCtx)) { return -1.0f; } 333 | 334 | return (float) (videoCtx->avhandler->getAudioFrame(audioData, frameSize)); 335 | } 336 | 337 | void nativeFreeAudioData(int id) { 338 | shared_ptr videoCtx; 339 | if (!getVideoContext(id, videoCtx)) { return; } 340 | 341 | videoCtx->avhandler->freeAudioFrame(); 342 | } 343 | 344 | void nativeSetSeekTime(int id, float sec) { 345 | shared_ptr videoCtx; 346 | if (!getVideoContext(id, videoCtx)) { return; } 347 | 348 | if (videoCtx->avhandler->getDecoderState() < AVHandler::DecoderState::INITIALIZED) { 349 | LOG("Decoder is unavailable currently. \n"); 350 | return; 351 | } 352 | 353 | LOG("nativeSetSeekTime %f. 
\n", sec); 354 | videoCtx->avhandler->setSeekTime(sec); 355 | if (!videoCtx->avhandler->getVideoInfo().isEnabled) { 356 | videoCtx->isContentReady = true; 357 | } else { 358 | videoCtx->isContentReady = false; 359 | } 360 | } 361 | 362 | bool nativeIsSeekOver(int id) { 363 | shared_ptr videoCtx; 364 | if (!getVideoContext(id, videoCtx)) { return false; } 365 | 366 | return !(videoCtx->avhandler->getDecoderState() == AVHandler::DecoderState::SEEK); 367 | } 368 | 369 | bool nativeIsVideoBufferFull(int id) { 370 | shared_ptr videoCtx; 371 | if (!getVideoContext(id, videoCtx)) { return false; } 372 | 373 | return videoCtx->avhandler->isVideoBufferFull(); 374 | } 375 | 376 | bool nativeIsVideoBufferEmpty(int id) { 377 | shared_ptr videoCtx; 378 | if (!getVideoContext(id, videoCtx)) { return false; } 379 | 380 | return videoCtx->avhandler->isVideoBufferEmpty(); 381 | } 382 | 383 | /* This function is for thumbnail extraction.*/ 384 | void nativeLoadThumbnail(int id, float time, void* texY, void* texU, void* texV) { 385 | if (g_D3D11Device == NULL) { 386 | LOG("g_D3D11Device is null. 
\n"); 387 | return; 388 | } 389 | 390 | shared_ptr videoCtx; 391 | if (!getVideoContext(id, videoCtx)) { return; } 392 | 393 | // 1.Initialize variable and texture 394 | AVHandler* avhandler = videoCtx->avhandler.get(); 395 | IDecoder::VideoInfo* videoInfo = &(avhandler->getVideoInfo()); 396 | int width = (int) (ceil((float) videoInfo->width / ITextureObject::CPU_ALIGMENT) * ITextureObject::CPU_ALIGMENT); 397 | int height = videoInfo->height; 398 | 399 | // 2.Get thumbnail data and update texture 400 | avhandler->setSeekTime(time); 401 | thread thumbnailThread([&]() { 402 | uint8_t* yptr = NULL, *uptr = NULL, *vptr = NULL; 403 | 404 | avhandler->getVideoFrame(&yptr, &uptr, &vptr); 405 | while (yptr == NULL) { 406 | avhandler->freeVideoFrame(); 407 | avhandler->getVideoFrame(&yptr, &uptr, &vptr); 408 | } 409 | 410 | ID3D11DeviceContext* ctx = NULL; 411 | ID3D11Texture2D* d3dtex0 = (ID3D11Texture2D*)texY; 412 | ID3D11Texture2D* d3dtex1 = (ID3D11Texture2D*)texU; 413 | ID3D11Texture2D* d3dtex2 = (ID3D11Texture2D*)texV; 414 | 415 | g_D3D11Device->GetImmediateContext(&ctx); 416 | ctx->UpdateSubresource(d3dtex0, 0, NULL, yptr, width, 0); 417 | ctx->UpdateSubresource(d3dtex1, 0, NULL, uptr, width / 2, 0); 418 | ctx->UpdateSubresource(d3dtex2, 0, NULL, vptr, width / 2, 0); 419 | ctx->Release(); 420 | }); 421 | 422 | if (thumbnailThread.joinable()) { 423 | thumbnailThread.join(); 424 | } 425 | } 426 | 427 | int nativeGetMetaData(const char* filePath, char*** key, char*** value) { 428 | unique_ptr avhandler = make_unique(); 429 | avhandler->init(filePath); 430 | 431 | char** metaKey = NULL; 432 | char** metaValue = NULL; 433 | int metaCount = avhandler->getMetaData(metaKey, metaValue); 434 | 435 | *key = (char**)CoTaskMemAlloc(sizeof(char*) * metaCount); 436 | *value = (char**)CoTaskMemAlloc(sizeof(char*) * metaCount); 437 | 438 | for (int i = 0; i < metaCount; i++) { 439 | (*key)[i] = (char*)CoTaskMemAlloc(strlen(metaKey[i]) + 1); 440 | (*value)[i] = 
(char*)CoTaskMemAlloc(strlen(metaValue[i]) + 1); 441 | strcpy_s((*key)[i], strlen(metaKey[i]) + 1, metaKey[i]); 442 | strcpy_s((*value)[i], strlen(metaValue[i]) + 1, metaValue[i]); 443 | } 444 | 445 | free(metaKey); 446 | free(metaValue); 447 | 448 | return metaCount; 449 | } 450 | 451 | bool nativeIsContentReady(int id) { 452 | shared_ptr<VideoContext> videoCtx; 453 | if (!getVideoContext(id, videoCtx)) { return false; } 454 | 455 | return videoCtx->isContentReady; 456 | } 457 | 458 | void nativeSetVideoEnable(int id, bool isEnable) { 459 | shared_ptr<VideoContext> videoCtx; 460 | if (!getVideoContext(id, videoCtx)) { return; } 461 | 462 | videoCtx->avhandler->setVideoEnable(isEnable); 463 | } 464 | 465 | void nativeSetAudioEnable(int id, bool isEnable) { 466 | shared_ptr<VideoContext> videoCtx; 467 | if (!getVideoContext(id, videoCtx)) { return; } 468 | 469 | videoCtx->avhandler->setAudioEnable(isEnable); 470 | } 471 | 472 | void nativeSetAudioAllChDataEnable(int id, bool isEnable) { 473 | shared_ptr<VideoContext> videoCtx; 474 | if (!getVideoContext(id, videoCtx)) { return; } 475 | 476 | videoCtx->avhandler->setAudioAllChDataEnable(isEnable); 477 | } 478 | 479 | bool nativeIsEOF(int id) { 480 | shared_ptr<VideoContext> videoCtx; 481 | if (!getVideoContext(id, videoCtx) || videoCtx->avhandler == NULL) { return true; } 482 | 483 | return videoCtx->avhandler->getDecoderState() == AVHandler::DecoderState::DECODE_EOF; 484 | } 485 | 486 | //extern "C" EXPORT_API void nativeGetTextureType(void* ptr0) { 487 | // ID3D11Texture2D* d3dtex = (ID3D11Texture2D*)(ptr0); 488 | // D3D11_TEXTURE2D_DESC desc; 489 | // d3dtex->GetDesc(&desc); 490 | // LOG("Texture format = %d \n", desc.Format); 491 | //} 492 | 493 | //int saveByteBuffer(const unsigned char* buff, int fileLength, const char* filePath){ 494 | // FILE *fout = fopen(filePath, "wb+"); 495 | // if (!fout){ 496 | // printf("Can't open output file\n"); 497 | // return 1; 498 | // } 499 | // fwrite(buff, sizeof(char)*fileLength, 1, fout); 500 | // fclose(fout); 501 | // return NO_ERROR; 502 | 
//} -------------------------------------------------------------------------------- /NativeCode/ViveMediaDecoder.def: -------------------------------------------------------------------------------- 1 | ; file used by Visual Studio plugin builds, mostly for 32-bit 2 | ; to stop mangling our exported function names 3 | 4 | LIBRARY 5 | 6 | EXPORTS 7 | UnityPluginLoad 8 | UnityPluginUnload -------------------------------------------------------------------------------- /NativeCode/ViveMediaDecoder.h: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | #pragma once 4 | 5 | // Which platform we are on? 6 | #if _MSC_VER 7 | #define UNITY_WIN 1 8 | #endif 9 | 10 | // Which graphics device APIs we possibly support? 11 | #if UNITY_WIN 12 | #define SUPPORT_D3D11 1 13 | #endif 14 | 15 | #if SUPPORT_D3D11 16 | # include <d3d11.h> 17 | # include "Unity\IUnityGraphicsD3D11.h" 18 | #endif 19 | 20 | extern "C" { 21 | // Decoder 22 | __declspec(dllexport) int nativeCreateDecoder(const char* filePath, int& id); 23 | __declspec(dllexport) int nativeCreateDecoderAsync(const char* filePath, int& id); 24 | __declspec(dllexport) int nativeGetDecoderState(int id); 25 | __declspec(dllexport) void nativeCreateTexture(int id, void*& tex0, void*& tex1, void*& tex2); 26 | __declspec(dllexport) bool nativeStartDecoding(int id); 27 | __declspec(dllexport) void nativeDestroyDecoder(int id); 28 | __declspec(dllexport) bool nativeIsEOF(int id); 29 | // Video 30 | __declspec(dllexport) bool nativeIsVideoEnabled(int id); 31 | __declspec(dllexport) void nativeSetVideoEnable(int id, bool isEnable); 32 | __declspec(dllexport) void nativeGetVideoFormat(int id, int& width, int& height, float& totalTime); 33 | __declspec(dllexport) void nativeSetVideoTime(int id, float currentTime); 34 | __declspec(dllexport) bool nativeIsContentReady(int id); 35 | __declspec(dllexport) bool 
nativeIsVideoBufferFull(int id); 36 | __declspec(dllexport) bool nativeIsVideoBufferEmpty(int id); 37 | // Audio 38 | __declspec(dllexport) bool nativeIsAudioEnabled(int id); 39 | __declspec(dllexport) void nativeSetAudioEnable(int id, bool isEnable); 40 | __declspec(dllexport) void nativeSetAudioAllChDataEnable(int id, bool isEnable); 41 | __declspec(dllexport) void nativeGetAudioFormat(int id, int& channel, int& frequency, float& totalTime); 42 | __declspec(dllexport) float nativeGetAudioData(int id, unsigned char** audioData, int& frameSize); 43 | __declspec(dllexport) void nativeFreeAudioData(int id); 44 | // Seek 45 | __declspec(dllexport) void nativeSetSeekTime(int id, float sec); 46 | __declspec(dllexport) bool nativeIsSeekOver(int id); 47 | // Utility 48 | __declspec(dllexport) int nativeGetMetaData(const char* filePath, char*** key, char*** value); 49 | __declspec(dllexport) void nativeLoadThumbnail(int id, float time, void* texY, void* texU, void* texV); 50 | } -------------------------------------------------------------------------------- /NativeCode/include/Unity/IUnityGraphics.h: -------------------------------------------------------------------------------- 1 | #pragma once 2 | #include "Unity\IUnityInterface.h" 3 | 4 | typedef enum UnityGfxRenderer 5 | { 6 | kUnityGfxRendererOpenGL = 0, // Legacy OpenGL 7 | kUnityGfxRendererD3D9 = 1, // Direct3D 9 8 | kUnityGfxRendererD3D11 = 2, // Direct3D 11 9 | kUnityGfxRendererGCM = 3, // PlayStation 3 10 | kUnityGfxRendererNull = 4, // "null" device (used in batch mode) 11 | kUnityGfxRendererXenon = 6, // Xbox 360 12 | kUnityGfxRendererOpenGLES20 = 8, // OpenGL ES 2.0 13 | kUnityGfxRendererOpenGLES30 = 11, // OpenGL ES 3.x 14 | kUnityGfxRendererGXM = 12, // PlayStation Vita 15 | kUnityGfxRendererPS4 = 13, // PlayStation 4 16 | kUnityGfxRendererXboxOne = 14, // Xbox One 17 | kUnityGfxRendererMetal = 16, // iOS Metal 18 | kUnityGfxRendererOpenGLCore = 17, // OpenGL core 19 | kUnityGfxRendererD3D12 = 18, // 
Direct3D 12 20 | } UnityGfxRenderer; 21 | 22 | typedef enum UnityGfxDeviceEventType 23 | { 24 | kUnityGfxDeviceEventInitialize = 0, 25 | kUnityGfxDeviceEventShutdown = 1, 26 | kUnityGfxDeviceEventBeforeReset = 2, 27 | kUnityGfxDeviceEventAfterReset = 3, 28 | } UnityGfxDeviceEventType; 29 | 30 | typedef void (UNITY_INTERFACE_API * IUnityGraphicsDeviceEventCallback)(UnityGfxDeviceEventType eventType); 31 | 32 | // Should only be used on the rendering thread unless noted otherwise. 33 | UNITY_DECLARE_INTERFACE(IUnityGraphics) 34 | { 35 | UnityGfxRenderer (UNITY_INTERFACE_API * GetRenderer)(); // Thread safe 36 | 37 | // This callback will be called when graphics device is created, destroyed, reset, etc. 38 | // It is possible to miss the kUnityGfxDeviceEventInitialize event in case plugin is loaded at a later time, 39 | // when the graphics device is already created. 40 | void (UNITY_INTERFACE_API * RegisterDeviceEventCallback)(IUnityGraphicsDeviceEventCallback callback); 41 | void (UNITY_INTERFACE_API * UnregisterDeviceEventCallback)(IUnityGraphicsDeviceEventCallback callback); 42 | }; 43 | UNITY_REGISTER_INTERFACE_GUID(0x7CBA0A9CA4DDB544ULL,0x8C5AD4926EB17B11ULL,IUnityGraphics) 44 | 45 | 46 | 47 | // Certain Unity APIs (GL.IssuePluginEvent, CommandBuffer.IssuePluginEvent) can callback into native plugins. 48 | // Provide them with an address to a function of this signature. 49 | typedef void (UNITY_INTERFACE_API * UnityRenderingEvent)(int eventId); 50 | -------------------------------------------------------------------------------- /NativeCode/include/Unity/IUnityGraphicsD3D11.h: -------------------------------------------------------------------------------- 1 | #pragma once 2 | #include "Unity\IUnityInterface.h" 3 | 4 | // Should only be used on the rendering thread unless noted otherwise. 
5 | UNITY_DECLARE_INTERFACE(IUnityGraphicsD3D11) 6 | { 7 | ID3D11Device* (UNITY_INTERFACE_API * GetDevice)(); 8 | }; 9 | UNITY_REGISTER_INTERFACE_GUID(0xAAB37EF87A87D748ULL,0xBF76967F07EFB177ULL,IUnityGraphicsD3D11) 10 | -------------------------------------------------------------------------------- /NativeCode/include/Unity/IUnityInterface.h: -------------------------------------------------------------------------------- 1 | #pragma once 2 | 3 | // Unity native plugin API 4 | // Compatible with C99 5 | 6 | #if defined(__CYGWIN32__) 7 | #define UNITY_INTERFACE_API __stdcall 8 | #define UNITY_INTERFACE_EXPORT __declspec(dllexport) 9 | #elif defined(WIN32) || defined(_WIN32) || defined(__WIN32__) || defined(_WIN64) || defined(WINAPI_FAMILY) 10 | #define UNITY_INTERFACE_API __stdcall 11 | #define UNITY_INTERFACE_EXPORT __declspec(dllexport) 12 | #elif defined(__MACH__) || defined(__ANDROID__) || defined(__linux__) || defined(__QNX__) 13 | #define UNITY_INTERFACE_API 14 | #define UNITY_INTERFACE_EXPORT 15 | #else 16 | #define UNITY_INTERFACE_API 17 | #define UNITY_INTERFACE_EXPORT 18 | #endif 19 | 20 | 21 | 22 | // Unity Interface GUID 23 | // Ensures cross plugin uniqueness. 24 | // 25 | // Template specialization is used to produce a means of looking up a GUID from its payload type at compile time. 26 | // The net result should compile down to passing around the GUID. 27 | // 28 | // UNITY_REGISTER_INTERFACE_GUID should be placed in the header file of any payload definition outside of all namespaces. 29 | // The payload structure and the registration GUID are all that is required to expose the interface to other systems. 
30 | struct UnityInterfaceGUID 31 | { 32 | #ifdef __cplusplus 33 | UnityInterfaceGUID(unsigned long long high, unsigned long long low) 34 | : m_GUIDHigh(high) 35 | , m_GUIDLow(low) 36 | { 37 | } 38 | 39 | UnityInterfaceGUID(const UnityInterfaceGUID& other) 40 | { 41 | m_GUIDHigh = other.m_GUIDHigh; 42 | m_GUIDLow = other.m_GUIDLow; 43 | } 44 | 45 | UnityInterfaceGUID& operator=(const UnityInterfaceGUID& other) 46 | { 47 | m_GUIDHigh = other.m_GUIDHigh; 48 | m_GUIDLow = other.m_GUIDLow; 49 | return *this; 50 | } 51 | 52 | bool Equals(const UnityInterfaceGUID& other) const { return m_GUIDHigh == other.m_GUIDHigh && m_GUIDLow == other.m_GUIDLow; } 53 | bool LessThan(const UnityInterfaceGUID& other) const { return m_GUIDHigh < other.m_GUIDHigh || (m_GUIDHigh == other.m_GUIDHigh && m_GUIDLow < other.m_GUIDLow); } 54 | #endif 55 | unsigned long long m_GUIDHigh; 56 | unsigned long long m_GUIDLow; 57 | }; 58 | #ifdef __cplusplus 59 | inline bool operator==(const UnityInterfaceGUID& left, const UnityInterfaceGUID& right) { return left.Equals(right); } 60 | inline bool operator!=(const UnityInterfaceGUID& left, const UnityInterfaceGUID& right) { return !left.Equals(right); } 61 | inline bool operator< (const UnityInterfaceGUID& left, const UnityInterfaceGUID& right) { return left.LessThan(right); } 62 | inline bool operator> (const UnityInterfaceGUID& left, const UnityInterfaceGUID& right) { return right.LessThan(left); } 63 | inline bool operator>=(const UnityInterfaceGUID& left, const UnityInterfaceGUID& right) { return !operator< (left,right); } 64 | inline bool operator<=(const UnityInterfaceGUID& left, const UnityInterfaceGUID& right) { return !operator> (left,right); } 65 | #else 66 | typedef struct UnityInterfaceGUID UnityInterfaceGUID; 67 | #endif 68 | 69 | 70 | 71 | #define UNITY_GET_INTERFACE_GUID(TYPE) TYPE##_GUID 72 | #define UNITY_GET_INTERFACE(INTERFACES, TYPE) (TYPE*)INTERFACES->GetInterface(UNITY_GET_INTERFACE_GUID(TYPE)); 73 | 74 | #ifdef __cplusplus 75 | 
#define UNITY_DECLARE_INTERFACE(NAME) \ 76 | struct NAME : IUnityInterface 77 | 78 | template<typename TYPE> \ 79 | inline const UnityInterfaceGUID GetUnityInterfaceGUID(); \ 80 | 81 | #define UNITY_REGISTER_INTERFACE_GUID(HASHH, HASHL, TYPE) \ 82 | const UnityInterfaceGUID TYPE##_GUID(HASHH, HASHL); \ 83 | template<> \ 84 | inline const UnityInterfaceGUID GetUnityInterfaceGUID<TYPE>() \ 85 | { \ 86 | return UNITY_GET_INTERFACE_GUID(TYPE); \ 87 | } 88 | #else 89 | #define UNITY_DECLARE_INTERFACE(NAME) \ 90 | typedef struct NAME NAME; \ 91 | struct NAME 92 | 93 | #define UNITY_REGISTER_INTERFACE_GUID(HASHH, HASHL, TYPE) \ 94 | const UnityInterfaceGUID TYPE##_GUID = {HASHH, HASHL}; 95 | #endif 96 | 97 | 98 | 99 | #ifdef __cplusplus 100 | struct IUnityInterface 101 | { 102 | }; 103 | #else 104 | typedef void IUnityInterface; 105 | #endif 106 | 107 | 108 | 109 | typedef struct IUnityInterfaces 110 | { 111 | // Returns an interface matching the guid. 112 | // Returns nullptr if the given interface is unavailable in the active Unity runtime. 113 | IUnityInterface* (UNITY_INTERFACE_API * GetInterface)(UnityInterfaceGUID guid); 114 | 115 | // Registers a new interface. 116 | void (UNITY_INTERFACE_API * RegisterInterface)(UnityInterfaceGUID guid, IUnityInterface* ptr); 117 | 118 | #ifdef __cplusplus 119 | // Helper for GetInterface. 120 | template<typename INTERFACE> 121 | INTERFACE* Get() 122 | { 123 | return static_cast<INTERFACE*>(GetInterface(GetUnityInterfaceGUID<INTERFACE>())); 124 | } 125 | 126 | // Helper for RegisterInterface. 127 | template<typename INTERFACE> 128 | void Register(IUnityInterface* ptr) 129 | { 130 | RegisterInterface(GetUnityInterfaceGUID<INTERFACE>(), ptr); 131 | } 132 | #endif 133 | } IUnityInterfaces; 134 | 135 | 136 | 137 | #ifdef __cplusplus 138 | extern "C" { 139 | #endif 140 | 141 | // If exported by a plugin, this function will be called when the plugin is loaded. 
142 | void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces); 143 | // If exported by a plugin, this function will be called when the plugin is about to be unloaded. 144 | void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginUnload(); 145 | 146 | #ifdef __cplusplus 147 | } 148 | #endif 149 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ViveMediaDecoder 2 | Copyright (c) 2015-2019, HTC Corporation. All rights reserved. 3 | 4 | ## Introduction: 5 | - ViveMediaDecoder is a high-performance video-decoding Unity plugin for Windows 6 | that supports streaming in multiple formats. 7 | 8 | - We also provide samples for playing various video types, including 9 | 2D, stereo 2D, 360, and stereo 360. You can easily build a custom VR video player 10 | with this plugin. 11 | 12 | - This software uses FFmpeg 3.4, which is licensed under the LGPL. 13 | To review the LGPL license, see [here](https://www.gnu.org/licenses/lgpl-3.0.en.html). 14 | The FFmpeg 3.4 source code can be downloaded from [here](https://ffmpeg.org/releases/ffmpeg-3.4.tar.bz2). 15 | 16 | ## Requirements: 17 | - Windows 7 18 | - DirectX 11 19 | - Unity 5 20 | - FFmpeg 3.4 -------------------------------------------------------------------------------- /UnityPackage/Copyright notice.txt: -------------------------------------------------------------------------------- 1 | ViveMediaDecoder for Unity - v1.1.7 2 | Copyright 2015-2019 HTC Corporation. All Rights Reserved. 3 | 4 | The works ("Work") refer to this software provided by HTC Corporation ("HTC") and published in Unity Store. 5 | This Work distributed from Unity Store is licensed under the Asset Store End User License Agreement (Asset Store EULA). 6 | Please review and follow the terms of Asset Store EULA on http://unity3d.com/pt/legal/as_terms and its subsequent revisions if any. 
7 | Unless otherwise provided herein, the information contained in the Work is the exclusive property of HTC. -------------------------------------------------------------------------------- /UnityPackage/Copyright notice.txt.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 70458e7816c009345adcb8e51ae59712 3 | timeCreated: 1473231842 4 | licenseType: Store 5 | TextScriptImporter: 6 | userData: 7 | assetBundleName: 8 | assetBundleVariant: 9 | -------------------------------------------------------------------------------- /UnityPackage/Materials.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 3d4d447cd589132459b4dd46600efdac 3 | folderAsset: yes 4 | timeCreated: 1452740837 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Materials/YUV2RGBA.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ViveSoftware/ViveMediaDecoder/97eabe8d21dcfd945103dcde39f5dff5342bdc01/UnityPackage/Materials/YUV2RGBA.mat -------------------------------------------------------------------------------- /UnityPackage/Materials/YUV2RGBA.mat.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 8d69b9fde0aea284291e3e133a2e16ba 3 | timeCreated: 1438244006 4 | licenseType: Store 5 | NativeFormatImporter: 6 | userData: 7 | assetBundleName: 8 | assetBundleVariant: 9 | -------------------------------------------------------------------------------- /UnityPackage/Plugins.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: efa8a84b73c8a484d81e581bef8dd4fa 3 | folderAsset: yes 4 | timeCreated: 1452740811 5 
| licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Plugins/x86.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 02acada84185f8646aa6ba78a7579006 3 | folderAsset: yes 4 | timeCreated: 1466675386 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Plugins/x86/ViveMediaDecoder.dll: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ViveSoftware/ViveMediaDecoder/97eabe8d21dcfd945103dcde39f5dff5342bdc01/UnityPackage/Plugins/x86/ViveMediaDecoder.dll -------------------------------------------------------------------------------- /UnityPackage/Plugins/x86/ViveMediaDecoder.dll.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 742f4a0ef347b334e8039f41a4aa0221 3 | timeCreated: 1466675386 4 | licenseType: Pro 5 | PluginImporter: 6 | serializedVersion: 1 7 | iconMap: {} 8 | executionOrder: {} 9 | isPreloaded: 0 10 | platformData: 11 | Any: 12 | enabled: 1 13 | settings: {} 14 | Editor: 15 | enabled: 0 16 | settings: 17 | CPU: x86 18 | DefaultValueInitialized: true 19 | OS: Windows 20 | Linux: 21 | enabled: 1 22 | settings: 23 | CPU: x86 24 | Linux64: 25 | enabled: 0 26 | settings: 27 | CPU: None 28 | LinuxUniversal: 29 | enabled: 1 30 | settings: 31 | CPU: x86 32 | OSXIntel: 33 | enabled: 1 34 | settings: 35 | CPU: AnyCPU 36 | OSXIntel64: 37 | enabled: 0 38 | settings: 39 | CPU: None 40 | OSXUniversal: 41 | enabled: 0 42 | settings: 43 | CPU: x86 44 | Win: 45 | enabled: 1 46 | settings: 47 | CPU: AnyCPU 48 | Win64: 49 | enabled: 0 50 | settings: 51 | CPU: None 52 | userData: 
53 | assetBundleName: 54 | assetBundleVariant: 55 | -------------------------------------------------------------------------------- /UnityPackage/Plugins/x86_64.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 3834cbe3e18049e4994db24a55e4daa5 3 | folderAsset: yes 4 | timeCreated: 1452740818 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Plugins/x86_64/ViveMediaDecoder.dll: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ViveSoftware/ViveMediaDecoder/97eabe8d21dcfd945103dcde39f5dff5342bdc01/UnityPackage/Plugins/x86_64/ViveMediaDecoder.dll -------------------------------------------------------------------------------- /UnityPackage/Plugins/x86_64/ViveMediaDecoder.dll.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: c9926958caada8244897ad89dc4a8f96 3 | timeCreated: 1484652456 4 | licenseType: Pro 5 | PluginImporter: 6 | serializedVersion: 1 7 | iconMap: {} 8 | executionOrder: {} 9 | isPreloaded: 0 10 | isOverridable: 0 11 | platformData: 12 | Any: 13 | enabled: 1 14 | settings: {} 15 | Editor: 16 | enabled: 0 17 | settings: 18 | CPU: x86_64 19 | DefaultValueInitialized: true 20 | Linux: 21 | enabled: 0 22 | settings: 23 | CPU: None 24 | Linux64: 25 | enabled: 1 26 | settings: 27 | CPU: x86_64 28 | LinuxUniversal: 29 | enabled: 1 30 | settings: 31 | CPU: x86_64 32 | OSXIntel: 33 | enabled: 0 34 | settings: 35 | CPU: None 36 | OSXIntel64: 37 | enabled: 1 38 | settings: 39 | CPU: AnyCPU 40 | OSXUniversal: 41 | enabled: 0 42 | settings: 43 | CPU: x86_64 44 | Win: 45 | enabled: 0 46 | settings: 47 | CPU: None 48 | Win64: 49 | enabled: 1 50 | settings: 51 | CPU: AnyCPU 52 | userData: 53 | assetBundleName: 54 | 
assetBundleVariant: 55 | -------------------------------------------------------------------------------- /UnityPackage/Scenes.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 8bb80c0a75526cc44b3d406f5f57d155 3 | folderAsset: yes 4 | timeCreated: 1452740849 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Scenes/Demo.unity: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ViveSoftware/ViveMediaDecoder/97eabe8d21dcfd945103dcde39f5dff5342bdc01/UnityPackage/Scenes/Demo.unity -------------------------------------------------------------------------------- /UnityPackage/Scenes/Demo.unity.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: feee831dbdd516842a6c5d31ae54864e 3 | timeCreated: 1464333231 4 | licenseType: Store 5 | DefaultImporter: 6 | userData: 7 | assetBundleName: 8 | assetBundleVariant: 9 | -------------------------------------------------------------------------------- /UnityPackage/Scenes/SampleScene.unity: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ViveSoftware/ViveMediaDecoder/97eabe8d21dcfd945103dcde39f5dff5342bdc01/UnityPackage/Scenes/SampleScene.unity -------------------------------------------------------------------------------- /UnityPackage/Scenes/SampleScene.unity.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: b231e9d754212544eaf2e270a94cbae9 3 | timeCreated: 1453120794 4 | licenseType: Store 5 | DefaultImporter: 6 | userData: 7 | assetBundleName: 8 | assetBundleVariant: 9 | 
-------------------------------------------------------------------------------- /UnityPackage/Scripts.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 682d87da43b7e5c45ac9385138b8952f 3 | folderAsset: yes 4 | timeCreated: 1452740803 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: dfb1b19b9ba1ee046949210236a96841 3 | folderAsset: yes 4 | timeCreated: 1455866104 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/FileSeeker.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using System.IO; 5 | using System.Collections; 6 | using System.Collections.Generic; 7 | 8 | namespace HTC.UnityPlugin.Multimedia 9 | { 10 | public class FileSeeker { 11 | private const string LOG_TAG = "[FileSeeker]"; 12 | 13 | private List<FileInfo> contentInfo = new List<FileInfo>(); 14 | public int contentIndex { get; private set; } 15 | 16 | public bool loadFolder(string path, string filter) { 17 | if (!Directory.Exists(path)) { 18 | Debug.Log(LOG_TAG + " path invalid."); 19 | return false; 20 | } 21 | 22 | string[] multipleFilter = filter.Split(new char[] { '|' }); 23 | DirectoryInfo dir = new DirectoryInfo(path); 24 | contentInfo.Clear(); 25 | for (int i = 0; i < multipleFilter.Length; i++) { 26 | contentInfo.AddRange(dir.GetFiles(multipleFilter[i])); 27 | } 28 | 29 | dir = null; 30 | Debug.Log(LOG_TAG + " Found " + contentInfo.Count + " files."); 31 | return contentInfo.Count > 0; 32 | } 33 | 34 | public string getPath() { 35 | if (contentInfo.Count > 0) { 36 | string path = contentInfo[contentIndex].FullName; 37 | Debug.Log(LOG_TAG + " Get path " + path); 38 | return path; 39 | } else { 40 | Debug.Log(LOG_TAG + " No path"); 41 | return ""; 42 | } 43 | } 44 | 45 | public string[] getPathAll() { 46 | string[] pathList = new string[contentInfo.Count]; 47 | for (int i = 0; i < contentInfo.Count; i++) { 48 | pathList[i] = contentInfo[i].FullName; 49 | } 50 | 51 | return pathList; 52 | } 53 | 54 | public int getFileNum() { 55 | return contentInfo.Count; 56 | } 57 | 58 | // Index control 59 | public int toNext() { 60 | if (contentInfo.Count > 0) { 61 | contentIndex = (contentIndex + 1) % contentInfo.Count; 62 | Debug.Log(LOG_TAG + " To next file index:" + contentIndex); 63 | } 64 | 65 | return contentIndex; 66 | } 67 | 68 | public int toPrev() { 69 | if (contentInfo.Count > 0) { 70 | contentIndex = (contentIndex + contentInfo.Count - 1) % contentInfo.Count; 71 | Debug.Log(LOG_TAG + " To prev file index:" + contentIndex); 72 | } 73 | 74 | 
return contentIndex; 75 | } 76 | 77 | public void setIndex(int index) { 78 | if (contentInfo.Count > 0) { 79 | contentIndex = index % contentInfo.Count; 80 | Debug.Log(LOG_TAG + " To file index:" + contentIndex); 81 | } 82 | } 83 | } 84 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/FileSeeker.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 9189184545f07c94a9091b5130f1ccd7 3 | timeCreated: 1432804739 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/ImageSourceController.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using UnityEngine.Events; 5 | using System.Collections; 6 | 7 | namespace HTC.UnityPlugin.Multimedia 8 | { 9 | [RequireComponent(typeof(MeshRenderer))] 10 | public class ImageSourceController : MonoBehaviour { 11 | protected string LOG_TAG = "[ImageSourceController]"; 12 | 13 | public string folderPath; 14 | public string filter; 15 | public bool isAdaptToResolution; 16 | public UnityEvent onInitComplete; 17 | public UnityEvent onChangeImage; 18 | protected bool isInitialized = false; 19 | protected FileSeeker fileSeeker; 20 | 21 | protected Texture2D texture; 22 | protected Vector3 oriScale; 23 | 24 | protected virtual void Start() { 25 | initFileSeeker(); 26 | texture = new Texture2D(1, 1); 27 | texture.filterMode = FilterMode.Trilinear; 28 | texture.Apply(); 29 | GetComponent<MeshRenderer>().material.mainTexture = texture; 30 | } 31 | 32 | public void initFileSeeker() { 33 | if (folderPath == null) { 34 | Debug.Log(LOG_TAG + " Folder path is null."); 35 | return; 36 | } 37 | 38 | isInitialized = false; 39 | 40 | fileSeeker = new FileSeeker(); 41 | if (!fileSeeker.loadFolder(folderPath, filter)) { 42 | Debug.Log(LOG_TAG + " content not found."); 43 | fileSeeker = null; 44 | return; 45 | } 46 | 47 | oriScale = transform.localScale; 48 | 49 | isInitialized = true; 50 | 51 | onInitComplete.Invoke(); 52 | } 53 | 54 | public void loadImage() { 55 | if (!isInitialized) { 56 | Debug.Log(LOG_TAG + " not initialized."); 57 | return; 58 | } 59 | 60 | StartCoroutine(loadImageCoroutine(fileSeeker.getPath())); 61 | } 62 | 63 | public void nextImage() { 64 | if (!isInitialized) { 65 | Debug.Log(LOG_TAG + " not initialized."); 66 | return; 67 | } 68 | 69 | fileSeeker.toNext(); 70 | 71 | onChangeImage.Invoke(); 72 | } 73 | 74 | public void prevImage() { 75 | if (!isInitialized) { 76 | Debug.Log(LOG_TAG + " not initialized."); 77 | return; 78 | } 79 | 80 | fileSeeker.toPrev(); 81 | 82 | onChangeImage.Invoke(); 83 | } 84 | 85 | protected IEnumerator 
loadImageCoroutine(string imagePath) { 86 | var www = new WWW("file://" + imagePath); 87 | yield return www; 88 | www.LoadImageIntoTexture(texture); 89 | 90 | if (isAdaptToResolution) { 91 | adaptResolution(); 92 | } 93 | } 94 | 95 | protected virtual void adaptResolution() { 96 | int width = texture.width; 97 | int height = texture.height; 98 | Vector3 adaptReso = oriScale; 99 | adaptReso.x *= ((float) width / height); 100 | transform.localScale = adaptReso; 101 | } 102 | } 103 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/ImageSourceController.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 8b8c17db3671a9549b163145fa2a9629 3 | timeCreated: 1464268433 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoHandler.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using System.Collections; 5 | 6 | namespace HTC.UnityPlugin.Multimedia 7 | { 8 | public static class StereoHandler { 9 | // Stereo members 10 | public enum StereoType {SIDE_BY_SIDE, TOP_DOWN}; 11 | public static void SetStereoPair(StereoProperty stereoProperty, Material material) { 12 | SetStereoPair(stereoProperty.left, stereoProperty.right, material, stereoProperty.stereoType, stereoProperty.isLeftFirst); 13 | } 14 | public static void SetStereoPair(GameObject left, GameObject right, Material material, StereoType stereoType, bool isLeftFirst) { 15 | if (left == null || right == null) { 16 | Debug.Log("Stereo targets are null."); 17 | return; 18 | } 19 | 20 | // 0.Get left frame and right frame UV 21 | MeshFilter leftMesh = left.GetComponent<MeshFilter>(); 22 | MeshFilter rightMesh = right.GetComponent<MeshFilter>(); 23 | 24 | if (leftMesh != null && rightMesh != null) { 25 | Vector2[] leftUV = leftMesh.mesh.uv; 26 | Vector2[] rightUV = rightMesh.mesh.uv; 27 | 28 | // 1.Modify UV 29 | for (int i = 0; i < leftUV.Length; i++) { 30 | if (stereoType == StereoType.SIDE_BY_SIDE) { 31 | leftUV[i].x = leftUV[i].x / 2; 32 | rightUV[i].x = rightUV[i].x / 2; 33 | if (isLeftFirst) { 34 | rightUV[i].x += 0.5f; 35 | } else { 36 | leftUV[i].x += 0.5f; 37 | } 38 | } else if (stereoType == StereoType.TOP_DOWN) { 39 | leftUV[i].y = leftUV[i].y / 2; 40 | rightUV[i].y = rightUV[i].y / 2; 41 | if (!isLeftFirst) { // For inverted UVs, the order is inverted too. 
42 | rightUV[i].y += 0.5f; 43 | } else { 44 | leftUV[i].y += 0.5f; 45 | } 46 | } 47 | } 48 | 49 | leftMesh.mesh.uv = leftUV; 50 | rightMesh.mesh.uv = rightUV; 51 | } 52 | 53 | // 2.Assign texture 54 | left.GetComponent<MeshRenderer>().material = material; 55 | right.GetComponent<MeshRenderer>().material = material; 56 | } 57 | } 58 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoHandler.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 639f6c1882a390046935a995d1b2aea4 3 | timeCreated: 1436338897 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoImageSourceController.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using System.Collections; 5 | 6 | namespace HTC.UnityPlugin.Multimedia 7 | { 8 | public class StereoImageSourceController : ImageSourceController { 9 | public StereoProperty stereoProperty; 10 | 11 | protected override void Start() { 12 | LOG_TAG = "[StereoImageSourceController]"; 13 | base.Start(); 14 | StereoHandler.SetStereoPair(stereoProperty, GetComponent<MeshRenderer>().material); 15 | } 16 | 17 | protected override void adaptResolution() { 18 | int width = texture.width; 19 | int height = texture.height; 20 | if (stereoProperty.stereoType == StereoHandler.StereoType.TOP_DOWN) { 21 | height /= 2; 22 | } else { 23 | width /= 2; 24 | } 25 | Vector3 adaptReso = oriScale; 26 | adaptReso.x *= ((float) width / height); 27 | transform.localScale = adaptReso; 28 | } 29 | } 30 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoImageSourceController.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: a921f0f583d0b73488a78ac65c7f3a9d 3 | timeCreated: 1464316915 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoProperty.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | 5 | namespace HTC.UnityPlugin.Multimedia 6 | { 7 | [System.Serializable] 8 | public class StereoProperty { 9 | public bool isLeftFirst; 10 | public StereoHandler.StereoType stereoType = StereoHandler.StereoType.TOP_DOWN; 11 | 12 | public GameObject left; 13 | public GameObject right; 14 | } 15 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoProperty.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 3584b664adc887d478326be51ee3ee20 3 | timeCreated: 1464317417 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoVideoSourceController.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using System.Collections; 5 | 6 | namespace HTC.UnityPlugin.Multimedia 7 | { 8 | public class StereoVideoSourceController : VideoSourceController { 9 | public StereoProperty stereoProperty; 10 | 11 | // Use this for initialization 12 | protected override void Start () { 13 | LOG_TAG = "[StereoVideoSourceController]"; 14 | base.Start(); 15 | StereoHandler.SetStereoPair(stereoProperty, GetComponent<MeshRenderer>().material); 16 | } 17 | 18 | protected override void adaptResolution() { 19 | int width = 1; 20 | int height = 1; 21 | decoder.getVideoResolution(ref width, ref height); 22 | if (stereoProperty.stereoType == StereoHandler.StereoType.TOP_DOWN) { 23 | height /= 2; 24 | } else { 25 | width /= 2; 26 | } 27 | Vector3 adaptReso = oriScale; 28 | adaptReso.x *= ((float) width / height); 29 | transform.localScale = adaptReso; 30 | } 31 | } 32 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/StereoVideoSourceController.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 21b1ac30ce9f64e4dbc05581c22819f1 3 | timeCreated: 1464319285 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/VideoSourceController.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using UnityEngine.Events; 5 | using System.Collections; 6 | 7 | namespace HTC.UnityPlugin.Multimedia 8 | { 9 | [RequireComponent(typeof(ViveMediaDecoder))] 10 | public class VideoSourceController : MonoBehaviour { 11 | protected string LOG_TAG = "[VideoSourceController]"; 12 | 13 | public string folderPath; 14 | public string filter; 15 | public bool isAdaptToResolution; 16 | public UnityEvent onInitComplete; 17 | public UnityEvent onChangeVideo; 18 | protected bool isInitialized = false; 19 | protected FileSeeker fileSeeker; 20 | 21 | protected ViveMediaDecoder decoder; 22 | protected Vector3 oriScale; 23 | 24 | protected virtual void Start () { 25 | decoder = GetComponent<ViveMediaDecoder>(); 26 | initFileSeeker(); 27 | } 28 | 29 | public void initFileSeeker() { 30 | if (folderPath == null) { 31 | Debug.Log(LOG_TAG + " Folder path is null."); 32 | return; 33 | } 34 | 35 | isInitialized = false; 36 | 37 | fileSeeker = new FileSeeker(); 38 | if (!fileSeeker.loadFolder(folderPath, filter)) { 39 | Debug.Log(LOG_TAG + " content not found."); 40 | fileSeeker = null; 41 | return; 42 | } 43 | 44 | oriScale = transform.localScale; 45 | 46 | isInitialized = true; 47 | 48 | onInitComplete.Invoke(); 49 | } 50 | 51 | public void startVideoPlay() { 52 | if (!isInitialized) { 53 | Debug.Log(LOG_TAG + " not initialized."); 54 | return; 55 | } 56 | 57 | if (isAdaptToResolution) { 58 | decoder.onInitComplete.AddListener(adaptResolution); 59 | } 60 | decoder.onInitComplete.AddListener(decoder.startDecoding); 61 | decoder.onInitComplete.AddListener(decoder.onInitComplete.RemoveAllListeners); 62 | decoder.initDecoder(fileSeeker.getPath()); 63 | } 64 | 65 | public void nextVideo() { 66 | if (!isInitialized) { 67 | Debug.Log(LOG_TAG + " not initialized."); 68 | return; 69 | } 70 | 71 | decoder.stopDecoding(); 72 | fileSeeker.toNext(); 73 | 74 | onChangeVideo.Invoke(); 75 | } 76 | 77 | public void prevVideo() { 78 | if (!isInitialized) { 79 | Debug.Log(LOG_TAG + 
" not initialized."); 80 | return; 81 | } 82 | 83 | decoder.stopDecoding(); 84 | fileSeeker.toPrev(); 85 | 86 | onChangeVideo.Invoke(); 87 | } 88 | 89 | public void stopVideo() { 90 | if (!isInitialized) { 91 | Debug.Log(LOG_TAG + " not initialized."); 92 | return; 93 | } 94 | 95 | decoder.stopDecoding(); 96 | } 97 | 98 | protected virtual void adaptResolution() { 99 | int width = 1; 100 | int height = 1; 101 | decoder.getVideoResolution(ref width, ref height); 102 | Vector3 adaptReso = oriScale; 103 | adaptReso.x *= ((float) width / height); 104 | transform.localScale = adaptReso; 105 | } 106 | } 107 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/PlayerScripts/VideoSourceController.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 769340f657b7ade4487e56cde526b1ab 3 | timeCreated: 1464264987 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/UVSphere.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 2 | 3 | using UnityEngine; 4 | using System.Collections; 5 | 6 | namespace HTC.UnityPlugin.Multimedia 7 | { 8 | [RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))] 9 | public class UVSphere : MonoBehaviour { 10 | public enum FrontFaceType { Inside, Outside }; 11 | public enum TextureOrigin { BottomLeft, TopLeft }; 12 | public int Latitude = 32; 13 | public int Longitude = 32; 14 | public float Radius = 1.0f; 15 | public FrontFaceType frontFaceType = FrontFaceType.Outside; 16 | public TextureOrigin textureOrigin = TextureOrigin.BottomLeft; 17 | 18 | // Use this for initialization 19 | void Awake () { 20 | GenerateUVSphere(Latitude, Longitude, Radius); 21 | } 22 | 23 | // This function generates a UV sphere whose top and bottom poles consist of multiple duplicated vertices. 24 | void GenerateUVSphere(int latNum, int longNum, float radius) { 25 | int vertexNum = (longNum + 1) * (latNum + 2); 26 | int meshNum = (longNum) * (latNum + 1); 27 | int triangleNum = meshNum * 2; 28 | 29 | // 1.Calculate vertices 30 | Vector3[] vertices = new Vector3[vertexNum]; 31 | float PI = Mathf.PI; 32 | float PI2 = PI * 2.0f; 33 | int latIdxMax = latNum + 1; // Latitude vertex number is latNum + 2, index numbers are from 0 ~ latNum + 1 34 | int longVertNum = longNum + 1; // Longitude vertex number is longNum + 1, index numbers are from 0 ~ longNum 35 | float preComputeV = PI / latIdxMax; 36 | float preComputeH = PI2 / longNum; 37 | for (int i = 0; i <= latIdxMax; i++) { 38 | float thetaV = i * preComputeV; // PI * i / latIdxMax; 39 | float sinV = Mathf.Sin(thetaV); 40 | float cosV = Mathf.Cos(thetaV); 41 | int lineStartIdx = i * longVertNum; 42 | for (int j = 0; j <= longNum; j++) { 43 | float thetaH = j * preComputeH; // PI2 * j / longNum; 44 | vertices[lineStartIdx + j] = new Vector3( 45 | Mathf.Cos(thetaH) * sinV, 46 | cosV, 47 | Mathf.Sin(thetaH) * sinV 48 | ) * radius; 49 | } 50 | } 51 | 52 | // 2.Calculate normals 53 | Vector3[] normals = new Vector3[vertices.Length]; 54 | 
for (int i = 0; i < vertices.Length; i++) { 55 | normals[i] = vertices[i].normalized; 56 | if (frontFaceType == FrontFaceType.Inside) { 57 | normals[i] *= -1.0f; 58 | } 59 | } 60 | 61 | // 3.Calculate uvs 62 | Vector2[] uvs = new Vector2[vertices.Length]; 63 | for (int i = 0; i <= latIdxMax; i++) { 64 | int lineStartIdx = i * longVertNum; 65 | float vVal = (float) i / latIdxMax; 66 | if (textureOrigin == TextureOrigin.BottomLeft) { 67 | vVal = 1.0f - vVal; 68 | } 69 | for (int j = 0; j <= longNum; j++) { 70 | float uVal = (float) j / longNum; 71 | if (frontFaceType == FrontFaceType.Inside) { 72 | uVal = 1.0f - uVal; 73 | } 74 | uvs[lineStartIdx + j] = new Vector2(uVal, vVal); 75 | } 76 | } 77 | 78 | // 4.Calculate triangles 79 | int[] triangles = new int[triangleNum * 3]; 80 | int index = 0; 81 | for (int i = 0; i <= latNum; i++) { 82 | for (int j = 0; j < longNum; j++) { 83 | int curVertIdx = i * longVertNum + j; 84 | int nextLineVertIdx = curVertIdx + longVertNum; 85 | 86 | if (frontFaceType == FrontFaceType.Outside) { 87 | triangles[index++] = curVertIdx; 88 | triangles[index++] = curVertIdx + 1; 89 | triangles[index++] = nextLineVertIdx + 1; 90 | triangles[index++] = curVertIdx; 91 | triangles[index++] = nextLineVertIdx + 1; 92 | triangles[index++] = nextLineVertIdx; 93 | } else { 94 | triangles[index++] = curVertIdx; 95 | triangles[index++] = nextLineVertIdx + 1; 96 | triangles[index++] = curVertIdx + 1; 97 | triangles[index++] = curVertIdx; 98 | triangles[index++] = nextLineVertIdx; 99 | triangles[index++] = nextLineVertIdx + 1; 100 | } 101 | } 102 | } 103 | 104 | // 5.Assign to mesh 105 | MeshFilter filter = gameObject.GetComponent<MeshFilter>(); 106 | Mesh mesh = filter.mesh; 107 | mesh.Clear(); 108 | mesh.vertices = vertices; 109 | mesh.normals = normals; 110 | mesh.uv = uvs; 111 | mesh.triangles = triangles; 112 | 113 | MeshRenderer renderer = gameObject.GetComponent<MeshRenderer>(); 114 | renderer.material.mainTexture = Texture2D.blackTexture; 115 | } 116 | } 117 | } 
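The vertex and index bookkeeping in GenerateUVSphere above — (longNum + 1) columns times (latNum + 2) rows of vertices (the seam column and both poles are duplicated so each vertex carries its own UV), with two triangles per quad — can be sanity-checked outside Unity. Below is a minimal Python sketch of the same spherical-coordinate layout; the function names are illustrative and not part of this package:

```python
import math

def uv_sphere_counts(lat_num, long_num):
    # Mirrors GenerateUVSphere: the seam column and both poles are
    # duplicated so every vertex can carry its own UV coordinate.
    vertex_num = (long_num + 1) * (lat_num + 2)
    mesh_num = long_num * (lat_num + 1)   # number of quads
    index_num = mesh_num * 2 * 3          # two triangles per quad, 3 indices each
    return vertex_num, index_num

def uv_sphere_vertices(lat_num, long_num, radius):
    # Same spherical-coordinate loop as the C# code.
    lat_idx_max = lat_num + 1
    verts = []
    for i in range(lat_idx_max + 1):
        theta_v = math.pi * i / lat_idx_max
        sin_v, cos_v = math.sin(theta_v), math.cos(theta_v)
        for j in range(long_num + 1):
            theta_h = 2.0 * math.pi * j / long_num
            verts.append((math.cos(theta_h) * sin_v * radius,
                          cos_v * radius,
                          math.sin(theta_h) * sin_v * radius))
    return verts

vertex_num, index_num = uv_sphere_counts(32, 32)
verts = uv_sphere_vertices(32, 32, 1.0)
assert len(verts) == vertex_num
# Every generated vertex lies on the sphere surface.
assert all(abs(math.sqrt(x*x + y*y + z*z) - 1.0) < 1e-6 for x, y, z in verts)
print(vertex_num, index_num)  # 1122 6336
```

With the default Latitude = Longitude = 32 this gives 1122 vertices and 6336 indices, matching `vertexNum` and `triangleNum * 3` in the C# code.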
-------------------------------------------------------------------------------- /UnityPackage/Scripts/UVSphere.cs.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 09443a9a5fbe0bb488c9506d4c82cfab 3 | timeCreated: 1454065786 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Scripts/ViveMediaDecoder.cs: -------------------------------------------------------------------------------- 1 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. =========== 2 | 3 | using UnityEngine; 4 | using UnityEngine.Events; 5 | using System; 6 | using System.Collections; 7 | using System.Collections.Generic; 8 | using System.Runtime.InteropServices; 9 | using System.ComponentModel; 10 | 11 | namespace HTC.UnityPlugin.Multimedia 12 | { 13 | [RequireComponent(typeof(MeshRenderer))] 14 | public class ViveMediaDecoder : MonoBehaviour 15 | { 16 | private const string NATIVE_LIBRARY_NAME = "ViveMediaDecoder"; 17 | 18 | // Decoder 19 | [DllImport(NATIVE_LIBRARY_NAME)] 20 | private static extern int nativeCreateDecoder(string filePath, ref int id); 21 | 22 | [DllImport(NATIVE_LIBRARY_NAME)] 23 | private static extern int nativeCreateDecoderAsync(string filePath, ref int id); 24 | 25 | [DllImport(NATIVE_LIBRARY_NAME)] 26 | private static extern int nativeGetDecoderState(int id); 27 | 28 | [DllImport(NATIVE_LIBRARY_NAME)] 29 | private static extern void nativeCreateTexture(int id, ref IntPtr tex0, ref IntPtr tex1, ref IntPtr tex2); 30 | 31 | [DllImport(NATIVE_LIBRARY_NAME)] 32 | private static extern bool nativeStartDecoding(int id); 33 | 34 | [DllImport(NATIVE_LIBRARY_NAME)] 35 | private static extern void nativeDestroyDecoder(int id); 36 | 37 | 
[DllImport(NATIVE_LIBRARY_NAME)] 38 | private static extern bool nativeIsVideoBufferFull(int id); 39 | 40 | [DllImport(NATIVE_LIBRARY_NAME)] 41 | private static extern bool nativeIsVideoBufferEmpty(int id); 42 | 43 | [DllImport(NATIVE_LIBRARY_NAME)] 44 | private static extern bool nativeIsEOF(int id); 45 | 46 | // Video 47 | [DllImport(NATIVE_LIBRARY_NAME)] 48 | private static extern bool nativeIsVideoEnabled(int id); 49 | 50 | [DllImport(NATIVE_LIBRARY_NAME)] 51 | private static extern void nativeSetVideoEnable(int id, bool isEnable); 52 | 53 | [DllImport(NATIVE_LIBRARY_NAME)] 54 | private static extern void nativeGetVideoFormat(int id, ref int width, ref int height, ref float totalTime); 55 | 56 | [DllImport(NATIVE_LIBRARY_NAME)] 57 | private static extern void nativeSetVideoTime(int id, float currentTime); 58 | 59 | [DllImport(NATIVE_LIBRARY_NAME)] 60 | private static extern bool nativeIsContentReady(int id); 61 | 62 | // Audio 63 | [DllImport(NATIVE_LIBRARY_NAME)] 64 | private static extern bool nativeIsAudioEnabled(int id); 65 | 66 | [DllImport(NATIVE_LIBRARY_NAME)] 67 | private static extern void nativeSetAudioEnable(int id, bool isEnable); 68 | 69 | [DllImport(NATIVE_LIBRARY_NAME)] 70 | private static extern void nativeSetAudioAllChDataEnable(int id, bool isEnable); 71 | 72 | [DllImport(NATIVE_LIBRARY_NAME)] 73 | private static extern void nativeGetAudioFormat(int id, ref int channel, ref int frequency, ref float totalTime); 74 | 75 | [DllImport(NATIVE_LIBRARY_NAME)] 76 | private static extern float nativeGetAudioData(int id, ref IntPtr output, ref int lengthPerChannel); 77 | 78 | [DllImport(NATIVE_LIBRARY_NAME)] 79 | private static extern void nativeFreeAudioData(int id); 80 | 81 | // Seek 82 | [DllImport(NATIVE_LIBRARY_NAME)] 83 | private static extern void nativeSetSeekTime(int id, float sec); 84 | 85 | [DllImport(NATIVE_LIBRARY_NAME)] 86 | private static extern bool nativeIsSeekOver(int id); 87 | 88 | // Utility 89 | [DllImport (NATIVE_LIBRARY_NAME)] 90 
| private static extern int nativeGetMetaData(string filePath, out IntPtr key, out IntPtr value); 91 | 92 | [DllImport (NATIVE_LIBRARY_NAME)] 93 | private static extern void nativeLoadThumbnail(int id, float time, IntPtr texY, IntPtr texU, IntPtr texV); 94 | 95 | // Render event 96 | [DllImport (NATIVE_LIBRARY_NAME)] 97 | private static extern IntPtr GetRenderEventFunc(); 98 | 99 | private const string VERSION = "1.1.7.190215"; 100 | public bool playOnAwake = false; 101 | public string mediaPath = null; // Assigned outside. 102 | public UnityEvent onInitComplete = null; // Initialization is asynchronous. Invoked after initialization completes. 103 | public UnityEvent onVideoEnd = null; // Invoked on video end. 104 | 105 | public enum DecoderState { 106 | INIT_FAIL = -2, 107 | STOP, 108 | NOT_INITIALIZED, 109 | INITIALIZING, 110 | INITIALIZED, 111 | START, 112 | PAUSE, 113 | SEEK_FRAME, 114 | BUFFERING, 115 | EOF 116 | }; 117 | private DecoderState lastState = DecoderState.NOT_INITIALIZED; 118 | private DecoderState decoderState = DecoderState.NOT_INITIALIZED; 119 | private int decoderID = -1; 120 | 121 | private const string LOG_TAG = "[ViveMediaDecoder]"; 122 | 123 | public bool isVideoEnabled { get; private set; } 124 | public bool isAudioEnabled { get; private set; } 125 | private bool isAllAudioChEnabled = false; 126 | 127 | private bool useDefault = true; // To set the default texture before the video is initialized. 128 | private bool seekPreview = false; // To preview the first frame of a seek performed while paused. 129 | private Texture2D videoTexYch = null; 130 | private Texture2D videoTexUch = null; 131 | private Texture2D videoTexVch = null; 132 | private int videoWidth = -1; 133 | private int videoHeight = -1; 134 | 135 | private const int AUDIO_FRAME_SIZE = 2048; // Audio clip data size. Packed from audioDataBuff. 136 | private const int SWAP_BUFFER_NUM = 4; // How many audio sources to swap between. 
137 | private AudioSource[] audioSource = new AudioSource[SWAP_BUFFER_NUM]; 138 | private List<float> audioDataBuff = null; // Buffer to keep audio data decoded from native. 139 | public int audioFrequency { get; private set; } 140 | public int audioChannels { get; private set; } 141 | private const double OVERLAP_TIME = 0.02; // Our audio clip is defined as: [overlap][audio data][overlap]. 142 | private int audioOverlapLength = 0; // OVERLAP_TIME * audioFrequency. 143 | private int audioDataLength = 0; // (AUDIO_FRAME_SIZE + 2 * audioOverlapLength) * audioChannels. 144 | private float volume = 1.0f; 145 | 146 | // Time control 147 | private double globalStartTime = 0; // Video and audio progress are based on this start time. 148 | private bool isVideoReadyToReplay = false; 149 | private bool isAudioReadyToReplay = false; 150 | private double audioProgressTime = -1.0; 151 | private double hangTime = -1.0; // Used to set progress time after seek/resume. 152 | private double firstAudioFrameTime = -1.0; 153 | public float videoTotalTime { get; private set; } // Video duration. 154 | public float audioTotalTime { get; private set; } // Audio duration. 155 | 156 | private BackgroundWorker backgroundWorker; 157 | private object _lock = new object(); 158 | 159 | void Awake () { 160 | print(LOG_TAG + " ver." + VERSION); 161 | if (playOnAwake) { 162 | print (LOG_TAG + " play on awake."); 163 | onInitComplete.AddListener(startDecoding); 164 | initDecoder(mediaPath); 165 | } 166 | } 167 | 168 | // Video progress is driven from Update. Progress time is set via nativeSetVideoTime. 
169 | void Update() { 170 | switch (decoderState) { 171 | case DecoderState.START: 172 | if (isVideoEnabled) { 173 | // Prevent an empty texture from generating a green screen (the default 0,0,0 in YUV is green in RGB). 174 | if (useDefault && nativeIsContentReady(decoderID)) { 175 | getTextureFromNative(); 176 | setTextures(videoTexYch, videoTexUch, videoTexVch); 177 | useDefault = false; 178 | } 179 | 180 | // Update video frame by dspTime. 181 | double setTime = AudioSettings.dspTime - globalStartTime; 182 | 183 | // Normal update frame. 184 | if (setTime < videoTotalTime || videoTotalTime == -1.0f) { 185 | if (seekPreview && nativeIsContentReady(decoderID)) { 186 | setPause(); 187 | seekPreview = false; 188 | unmute(); 189 | } else { 190 | nativeSetVideoTime(decoderID, (float) setTime); 191 | GL.IssuePluginEvent(GetRenderEventFunc(), decoderID); 192 | } 193 | } else { 194 | if (!nativeIsVideoBufferEmpty(decoderID)) { 195 | nativeSetVideoTime(decoderID, (float)setTime); 196 | GL.IssuePluginEvent(GetRenderEventFunc(), decoderID); 197 | } else { 198 | isVideoReadyToReplay = true; 199 | } 200 | } 201 | } 202 | 203 | if (nativeIsVideoBufferEmpty(decoderID) && !nativeIsEOF(decoderID)) { 204 | decoderState = DecoderState.BUFFERING; 205 | hangTime = AudioSettings.dspTime - globalStartTime; 206 | } 207 | 208 | break; 209 | 210 | case DecoderState.SEEK_FRAME: 211 | if (nativeIsSeekOver(decoderID)) { 212 | globalStartTime = AudioSettings.dspTime - hangTime; 213 | decoderState = DecoderState.START; 214 | if (lastState == DecoderState.PAUSE) { 215 | seekPreview = true; 216 | mute(); 217 | } 218 | } 219 | break; 220 | 221 | case DecoderState.BUFFERING: 222 | if (nativeIsVideoBufferFull(decoderID) || nativeIsEOF(decoderID)) { 223 | decoderState = DecoderState.START; 224 | globalStartTime = AudioSettings.dspTime - hangTime; 225 | } 226 | break; 227 | 228 | case DecoderState.PAUSE: 229 | case DecoderState.EOF: 230 | default: 231 | break; 232 | } 233 | 234 | if (isVideoEnabled || 
isAudioEnabled) { 235 | if ((!isVideoEnabled || isVideoReadyToReplay) && (!isAudioEnabled || isAllAudioChEnabled || isAudioReadyToReplay)) { 236 | decoderState = DecoderState.EOF; 237 | isVideoReadyToReplay = isAudioReadyToReplay = false; 238 | 239 | if (onVideoEnd != null) { 240 | onVideoEnd.Invoke(); 241 | } 242 | } 243 | } 244 | } 245 | 246 | public void initDecoder(string path, bool enableAllAudioCh = false) { 247 | isAllAudioChEnabled = enableAllAudioCh; 248 | StartCoroutine(initDecoderAsync(path)); 249 | } 250 | 251 | IEnumerator initDecoderAsync(string path) { 252 | print(LOG_TAG + " init Decoder."); 253 | decoderState = DecoderState.INITIALIZING; 254 | 255 | mediaPath = path; 256 | decoderID = -1; 257 | nativeCreateDecoderAsync(mediaPath, ref decoderID); 258 | 259 | int result = 0; 260 | do { 261 | yield return null; 262 | result = nativeGetDecoderState(decoderID); 263 | } while (!(result == 1 || result == -1)); 264 | 265 | // Init success. 266 | if (result == 1) { 267 | print(LOG_TAG + " Init success."); 268 | isVideoEnabled = nativeIsVideoEnabled(decoderID); 269 | if (isVideoEnabled) { 270 | float duration = 0.0f; 271 | nativeGetVideoFormat(decoderID, ref videoWidth, ref videoHeight, ref duration); 272 | videoTotalTime = duration > 0 ? duration : -1.0f ; 273 | print(LOG_TAG + " Video format: (" + videoWidth + ", " + videoHeight + ")"); 274 | print(LOG_TAG + " Total time: " + videoTotalTime); 275 | 276 | setTextures(null, null, null); 277 | useDefault = true; 278 | } 279 | 280 | // Initialize audio. 
281 | isAudioEnabled = nativeIsAudioEnabled(decoderID); 282 | print(LOG_TAG + " isAudioEnabled = " + isAudioEnabled); 283 | if (isAudioEnabled) { 284 | if (isAllAudioChEnabled) { 285 | nativeSetAudioAllChDataEnable(decoderID, isAllAudioChEnabled); 286 | getAudioFormat(); 287 | } else { 288 | getAudioFormat(); 289 | initAudioSource(); 290 | } 291 | } 292 | 293 | decoderState = DecoderState.INITIALIZED; 294 | 295 | if (onInitComplete != null) { 296 | onInitComplete.Invoke(); 297 | } 298 | } else { 299 | print(LOG_TAG + " Init fail."); 300 | decoderState = DecoderState.INIT_FAIL; 301 | } 302 | } 303 | 304 | private void getAudioFormat() { 305 | int channels = 0; 306 | int frequency = 0; 307 | float duration = 0.0f; 308 | nativeGetAudioFormat(decoderID, ref channels, ref frequency, ref duration); 309 | audioChannels = channels; 310 | audioFrequency = frequency; 311 | audioTotalTime = duration > 0 ? duration : -1.0f; 312 | print(LOG_TAG + " audioChannels " + audioChannels); 313 | print(LOG_TAG + " audioFrequency " + audioFrequency); 314 | print(LOG_TAG + " audioTotalTime " + audioTotalTime); 315 | } 316 | 317 | private void initAudioSource() { 318 | getAudioFormat(); 319 | audioOverlapLength = (int) (OVERLAP_TIME * audioFrequency + 0.5f); 320 | 321 | audioDataLength = (AUDIO_FRAME_SIZE + 2 * audioOverlapLength) * audioChannels; 322 | for (int i = 0; i < SWAP_BUFFER_NUM; i++) { 323 | if (audioSource[i] == null) { 324 | audioSource[i] = gameObject.AddComponent<AudioSource>(); 325 | } 326 | audioSource[i].clip = AudioClip.Create("testSound" + i, audioDataLength, audioChannels, audioFrequency, false); 327 | audioSource[i].playOnAwake = false; 328 | audioSource[i].volume = volume; 329 | audioSource[i].minDistance = audioSource[i].maxDistance; 330 | } 331 | } 332 | 333 | public void startDecoding() { 334 | if(decoderState == DecoderState.INITIALIZED) { 335 | if (!nativeStartDecoding(decoderID)) { 336 | print (LOG_TAG + " Decoding did not start."); 337 | return; 338 | } 339 | 340 | decoderState = 
DecoderState.BUFFERING; 341 | globalStartTime = AudioSettings.dspTime; 342 | hangTime = AudioSettings.dspTime - globalStartTime; 343 | 344 | isVideoReadyToReplay = isAudioReadyToReplay = false; 345 | if(isAudioEnabled && !isAllAudioChEnabled) { 346 | StartCoroutine ("audioPlay"); 347 | backgroundWorker = new BackgroundWorker(); 348 | backgroundWorker.WorkerSupportsCancellation = true; 349 | backgroundWorker.DoWork += new DoWorkEventHandler(pullAudioData); 350 | backgroundWorker.RunWorkerAsync(); 351 | } 352 | } 353 | } 354 | 355 | private void pullAudioData(object sender, DoWorkEventArgs e) { 356 | IntPtr dataPtr = IntPtr.Zero; // Pointer to get audio data from native. 357 | float[] tempBuff = new float[0]; // Buffer to copy audio data from dataPtr to audioDataBuff. 358 | int audioFrameLength = 0; 359 | double lastTime = -1.0; // Avoid scheduling the same audio data set twice. 360 | 361 | audioDataBuff = new List<float>(); 362 | while (decoderState >= DecoderState.START) { 363 | if (decoderState != DecoderState.SEEK_FRAME) { 364 | double audioNativeTime = nativeGetAudioData(decoderID, ref dataPtr, ref audioFrameLength); 365 | if (0 < audioNativeTime && lastTime != audioNativeTime && decoderState != DecoderState.SEEK_FRAME && audioFrameLength != 0) { 366 | if (firstAudioFrameTime == -1.0) { 367 | firstAudioFrameTime = audioNativeTime; 368 | } 369 | 370 | lastTime = audioNativeTime; 371 | audioFrameLength *= audioChannels; 372 | if (tempBuff.Length != audioFrameLength) { 373 | // For dynamic audio data length, reallocate the memory if needed. 
374 | tempBuff = new float[audioFrameLength]; 375 | } 376 | Marshal.Copy(dataPtr, tempBuff, 0, audioFrameLength); 377 | lock (_lock) { 378 | audioDataBuff.AddRange(tempBuff); 379 | } 380 | } 381 | 382 | if (audioNativeTime != -1.0) { 383 | nativeFreeAudioData(decoderID); 384 | } 385 | 386 | System.Threading.Thread.Sleep(2); 387 | } 388 | } 389 | 390 | lock (_lock) { 391 | audioDataBuff.Clear(); 392 | audioDataBuff = null; 393 | } 394 | } 395 | 396 | private void getTextureFromNative() { 397 | ReleaseTexture(); 398 | 399 | IntPtr nativeTexturePtrY = new IntPtr(); 400 | IntPtr nativeTexturePtrU = new IntPtr(); 401 | IntPtr nativeTexturePtrV = new IntPtr(); 402 | nativeCreateTexture(decoderID, ref nativeTexturePtrY, ref nativeTexturePtrU, ref nativeTexturePtrV); 403 | videoTexYch = Texture2D.CreateExternalTexture( 404 | videoWidth, videoHeight, TextureFormat.Alpha8, false, false, nativeTexturePtrY); 405 | videoTexUch = Texture2D.CreateExternalTexture( 406 | videoWidth / 2, videoHeight / 2, TextureFormat.Alpha8, false, false, nativeTexturePtrU); 407 | videoTexVch = Texture2D.CreateExternalTexture( 408 | videoWidth / 2, videoHeight / 2, TextureFormat.Alpha8, false, false, nativeTexturePtrV); 409 | } 410 | 411 | private void ReleaseTexture() { 412 | setTextures(null, null, null); 413 | 414 | videoTexYch = null; 415 | videoTexUch = null; 416 | videoTexVch = null; 417 | 418 | useDefault = true; 419 | } 420 | 421 | private void setTextures(Texture ytex, Texture utex, Texture vtex) { 422 | Material texMaterial = GetComponent<MeshRenderer>().material; 423 | texMaterial.SetTexture("_YTex", ytex); 424 | texMaterial.SetTexture("_UTex", utex); 425 | texMaterial.SetTexture("_VTex", vtex); 426 | } 427 | 428 | public void replay() { 429 | if (setSeekTime(0.0f)) { 430 | globalStartTime = AudioSettings.dspTime; 431 | isVideoReadyToReplay = isAudioReadyToReplay = false; 432 | } 433 | } 434 | 435 | public void getAllAudioChannelData(out float[] data, out double time, out int samplesPerChannel) { 436 
| if (!isAllAudioChEnabled) { 437 | print(LOG_TAG + " this function only works when isAllAudioChEnabled == true."); 438 | data = null; 439 | time = 0; 440 | samplesPerChannel = 0; 441 | return; 442 | } 443 | 444 | IntPtr dataPtr = new IntPtr(); 445 | int lengthPerChannel = 0; 446 | double audioNativeTime = nativeGetAudioData(decoderID, ref dataPtr, ref lengthPerChannel); 447 | float[] buff = null; 448 | if (lengthPerChannel > 0) { 449 | buff = new float[lengthPerChannel * audioChannels]; 450 | Marshal.Copy(dataPtr, buff, 0, buff.Length); 451 | nativeFreeAudioData(decoderID); 452 | } 453 | 454 | data = buff; 455 | time = audioNativeTime; 456 | samplesPerChannel = lengthPerChannel; 457 | } 458 | 459 | IEnumerator audioPlay() { 460 | print (LOG_TAG + " start audio play coroutine."); 461 | int swapIndex = 0; // Swap between audio sources. 462 | double audioDataTime = (double) AUDIO_FRAME_SIZE / audioFrequency; 463 | int playedAudioDataLength = AUDIO_FRAME_SIZE * audioChannels; // Data length excludes the overlap length. 464 | 465 | print(LOG_TAG + " audioDataTime " + audioDataTime); 466 | 467 | audioProgressTime = -1.0; // Used to schedule each audio clip to be played. 468 | while(decoderState >= DecoderState.START) { 469 | if(decoderState == DecoderState.START) { 470 | double currentTime = AudioSettings.dspTime - globalStartTime; 471 | if(currentTime < audioTotalTime || audioTotalTime == -1.0f) { 472 | if (audioDataBuff != null && audioDataBuff.Count >= audioDataLength) { 473 | if (audioProgressTime == -1.0) { 474 | // To simplify, the first overlap segment is not played. 475 | // Correct the audio progress time by adding OVERLAP_TIME. 476 | audioProgressTime = firstAudioFrameTime + OVERLAP_TIME; 477 | globalStartTime = AudioSettings.dspTime - audioProgressTime; 478 | } 479 | 480 | while (audioSource[swapIndex].isPlaying || decoderState == DecoderState.SEEK_FRAME) { yield return null; } 481 | 482 | // Re-check data length if audioDataBuff is cleared by seek. 
483 | if (audioDataBuff.Count >= audioDataLength) { 484 | double playTime = audioProgressTime + globalStartTime; 485 | double endTime = playTime + audioDataTime; 486 | 487 | // If audio is late, adjust start time and re-calculate audio clip time. 488 | if (playTime <= AudioSettings.dspTime) { 489 | globalStartTime = AudioSettings.dspTime - audioProgressTime; 490 | playTime = audioProgressTime + globalStartTime; 491 | endTime = playTime + audioDataTime; 492 | } 493 | 494 | audioSource[swapIndex].clip.SetData(audioDataBuff.GetRange(0, audioDataLength).ToArray(), 0); 495 | audioSource[swapIndex].PlayScheduled(playTime); 496 | audioSource[swapIndex].SetScheduledEndTime(endTime); 497 | audioSource[swapIndex].time = (float) OVERLAP_TIME; 498 | audioProgressTime += audioDataTime; 499 | swapIndex = (swapIndex + 1) % SWAP_BUFFER_NUM; 500 | 501 | lock (_lock) { 502 | audioDataBuff.RemoveRange(0, playedAudioDataLength); 503 | } 504 | } 505 | } 506 | } else { 507 | //print(LOG_TAG + " Audio reach EOF. 
Prepare replay."); 508 | isAudioReadyToReplay = true; 509 | audioProgressTime = firstAudioFrameTime = - 1.0; 510 | if (audioDataBuff != null) { 511 | lock (_lock) { 512 | audioDataBuff.Clear(); 513 | } 514 | } 515 | } 516 | } 517 | yield return new WaitForFixedUpdate(); 518 | } 519 | } 520 | 521 | public void stopDecoding() { 522 | if(decoderState >= DecoderState.INITIALIZING) { 523 | print (LOG_TAG + " stop decoding."); 524 | decoderState = DecoderState.STOP; 525 | ReleaseTexture(); 526 | if (isAudioEnabled && !isAllAudioChEnabled) { 527 | StopCoroutine ("audioPlay"); 528 | backgroundWorker.CancelAsync(); 529 | 530 | if (audioSource != null) { 531 | for (int i = 0; i < SWAP_BUFFER_NUM; i++) { 532 | if (audioSource[i] != null) { 533 | AudioClip.Destroy(audioSource[i].clip); 534 | AudioSource.Destroy(audioSource[i]); 535 | audioSource[i] = null; 536 | } 537 | } 538 | } 539 | } 540 | 541 | nativeDestroyDecoder (decoderID); 542 | decoderID = -1; 543 | decoderState = DecoderState.NOT_INITIALIZED; 544 | 545 | isVideoEnabled = isAudioEnabled = isAllAudioChEnabled = false; 546 | isVideoReadyToReplay = isAudioReadyToReplay = false; 547 | isAllAudioChEnabled = false; 548 | } 549 | } 550 | 551 | public bool setSeekTime(float seekTime) { 552 | if (decoderState != DecoderState.SEEK_FRAME && decoderState >= DecoderState.START) { 553 | lastState = decoderState; 554 | decoderState = DecoderState.SEEK_FRAME; 555 | 556 | float setTime = 0.0f; 557 | if ((isVideoEnabled && seekTime > videoTotalTime) || 558 | (isAudioEnabled && !isAllAudioChEnabled && seekTime > audioTotalTime) || 559 | isVideoReadyToReplay || isAudioReadyToReplay || 560 | seekTime < 0.0f) { 561 | print(LOG_TAG + " Seek over end. 
"); 562 | setTime = 0.0f; 563 | } else { 564 | setTime = seekTime; 565 | } 566 | 567 | print(LOG_TAG + " set seek time: " + setTime); 568 | hangTime = setTime; 569 | nativeSetSeekTime(decoderID, setTime); 570 | nativeSetVideoTime(decoderID, (float) setTime); 571 | 572 | if (isAudioEnabled && !isAllAudioChEnabled) { 573 | lock (_lock) { 574 | audioDataBuff.Clear(); 575 | } 576 | audioProgressTime = firstAudioFrameTime = -1.0; 577 | foreach (AudioSource src in audioSource) { 578 | src.Stop(); 579 | } 580 | } 581 | 582 | return true; 583 | } else { 584 | return false; 585 | } 586 | } 587 | 588 | public bool isSeeking() { 589 | return decoderState >= DecoderState.INITIALIZED && (decoderState == DecoderState.SEEK_FRAME || !nativeIsContentReady(decoderID)); 590 | } 591 | 592 | public bool isVideoEOF() { 593 | return decoderState == DecoderState.EOF; 594 | } 595 | 596 | public void setStepForward(float sec) { 597 | double targetTime = AudioSettings.dspTime - globalStartTime + sec; 598 | if (setSeekTime((float) targetTime)) { 599 | print(LOG_TAG + " set forward : " + sec); 600 | } 601 | } 602 | 603 | public void setStepBackward(float sec) { 604 | double targetTime = AudioSettings.dspTime - globalStartTime - sec; 605 | if (setSeekTime((float) targetTime)) { 606 | print(LOG_TAG + " set backward : " + sec); 607 | } 608 | } 609 | 610 | public void getVideoResolution(ref int width, ref int height) { 611 | width = videoWidth; 612 | height = videoHeight; 613 | } 614 | 615 | public float getVideoCurrentTime() { 616 | if (decoderState == DecoderState.PAUSE || decoderState == DecoderState.SEEK_FRAME) { 617 | return (float) hangTime; 618 | } else { 619 | return (float) (AudioSettings.dspTime - globalStartTime); 620 | } 621 | } 622 | 623 | public DecoderState getDecoderState() { 624 | return decoderState; 625 | } 626 | 627 | public void setPause() { 628 | if (decoderState == DecoderState.START) { 629 | hangTime = AudioSettings.dspTime - globalStartTime; 630 | decoderState = 
DecoderState.PAUSE; 631 | if (isAudioEnabled && !isAllAudioChEnabled) { 632 | foreach (AudioSource src in audioSource) { 633 | src.Pause(); 634 | } 635 | } 636 | } 637 | } 638 | 639 | public void setResume() { 640 | if (decoderState == DecoderState.PAUSE) { 641 | globalStartTime = AudioSettings.dspTime - hangTime; 642 | decoderState = DecoderState.START; 643 | if (isAudioEnabled && !isAllAudioChEnabled) { 644 | foreach (AudioSource src in audioSource) { 645 | src.UnPause(); 646 | } 647 | } 648 | } 649 | } 650 | 651 | public void setVolume(float vol) { 652 | volume = Mathf.Clamp(vol, 0.0f, 1.0f); 653 | foreach (AudioSource src in audioSource) { 654 | if(src != null) { 655 | src.volume = volume; 656 | } 657 | } 658 | } 659 | 660 | public float getVolume() { 661 | return volume; 662 | } 663 | 664 | public void mute() { 665 | float temp = volume; 666 | setVolume(0.0f); 667 | volume = temp; 668 | } 669 | 670 | public void unmute() { 671 | setVolume(volume); 672 | } 673 | 674 | public static void getMetaData(string filePath, out string[] key, out string[] value) { 675 | IntPtr keyptr = IntPtr.Zero; 676 | IntPtr valptr = IntPtr.Zero; 677 | 678 | int metaCount = nativeGetMetaData(filePath, out keyptr, out valptr); 679 | 680 | IntPtr[] keys = new IntPtr[metaCount]; 681 | IntPtr[] vals = new IntPtr[metaCount]; 682 | Marshal.Copy(keyptr, keys, 0, metaCount); 683 | Marshal.Copy(valptr, vals, 0, metaCount); 684 | 685 | string[] keyArray = new string[metaCount]; 686 | string[] valArray = new string[metaCount]; 687 | for (int i = 0; i < metaCount; i++) { 688 | keyArray[i] = Marshal.PtrToStringAnsi(keys[i]); 689 | valArray[i] = Marshal.PtrToStringAnsi(vals[i]); 690 | Marshal.FreeCoTaskMem(keys[i]); 691 | Marshal.FreeCoTaskMem(vals[i]); 692 | } 693 | Marshal.FreeCoTaskMem(keyptr); 694 | Marshal.FreeCoTaskMem(valptr); 695 | 696 | key = keyArray; 697 | value = valArray; 698 | } 699 | 700 | public static void loadVideoThumb(GameObject obj, string filePath, float time) 701 | { 702 | if 
(!System.IO.File.Exists(filePath)) { 703 | print(LOG_TAG + " File not found!"); 704 | return; 705 | } 706 | 707 | int decID = -1; 708 | int width = 0; 709 | int height = 0; 710 | float totalTime = 0.0f; 711 | nativeCreateDecoder(filePath, ref decID); 712 | nativeGetVideoFormat(decID, ref width, ref height, ref totalTime); 713 | if (!nativeStartDecoding(decID)) { 714 | print(LOG_TAG + " Decoding did not start."); 715 | return; 716 | } 717 | 718 | Texture2D thumbY = new Texture2D(width, height, TextureFormat.Alpha8, false); 719 | Texture2D thumbU = new Texture2D(width / 2, height / 2, TextureFormat.Alpha8, false); 720 | Texture2D thumbV = new Texture2D(width / 2, height / 2, TextureFormat.Alpha8, false); 721 | Material thumbMat = obj.GetComponent<MeshRenderer>().material; 722 | if (thumbMat == null) { 723 | print(LOG_TAG + " Target has no MeshRenderer."); 724 | nativeDestroyDecoder(decID); 725 | return; 726 | } 727 | thumbMat.SetTexture("_YTex", thumbY); 728 | thumbMat.SetTexture("_UTex", thumbU); 729 | thumbMat.SetTexture("_VTex", thumbV); 730 | 731 | nativeLoadThumbnail(decID, time, thumbY.GetNativeTexturePtr(), thumbU.GetNativeTexturePtr(), thumbV.GetNativeTexturePtr()); 732 | nativeDestroyDecoder(decID); 733 | } 734 | 735 | public void setAudioEnable(bool isEnable) { 736 | nativeSetAudioEnable(decoderID, isEnable); 737 | if (isEnable) { 738 | setSeekTime(getVideoCurrentTime()); 739 | } 740 | } 741 | 742 | public void setVideoEnable(bool isEnable) { 743 | nativeSetVideoEnable(decoderID, isEnable); 744 | if (isEnable) { 745 | setSeekTime(getVideoCurrentTime()); 746 | } 747 | } 748 | 749 | void OnApplicationQuit() { 750 | print (LOG_TAG + " OnApplicationQuit"); 751 | stopDecoding (); 752 | } 753 | 754 | void OnDestroy() { 755 | print (LOG_TAG + " OnDestroy"); 756 | stopDecoding (); 757 | } 758 | } 759 | } -------------------------------------------------------------------------------- /UnityPackage/Scripts/ViveMediaDecoder.cs.meta: 
-------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 17ff68aac63c8274fb51bddad8df1d0e 3 | timeCreated: 1432804739 4 | licenseType: Store 5 | MonoImporter: 6 | serializedVersion: 2 7 | defaultReferences: [] 8 | executionOrder: 0 9 | icon: {instanceID: 0} 10 | userData: 11 | assetBundleName: 12 | assetBundleVariant: 13 | -------------------------------------------------------------------------------- /UnityPackage/Shaders.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 478c05aae5c7d264789f561f2f515b56 3 | folderAsset: yes 4 | timeCreated: 1452740843 5 | licenseType: Store 6 | DefaultImporter: 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/Shaders/YUV2RGBA.shader: -------------------------------------------------------------------------------- 1 | // Upgrade NOTE: replaced 'mul(UNITY_MATRIX_MVP,*)' with 'UnityObjectToClipPos(*)' 2 | 3 | //========= Copyright 2015-2019, HTC Corporation. All rights reserved. 
=========== 4 | 5 | Shader "Unlit/YUV2RGBA" 6 | { 7 | Properties 8 | { 9 | _MainTex ("Texture", 2D) = "black" {} 10 | _YTex("Y channel", 2D) = "black" {} 11 | _UTex("U channel", 2D) = "gray" {} 12 | _VTex("V channel", 2D) = "gray" {} 13 | } 14 | SubShader 15 | { 16 | Tags { "RenderType"="Opaque" } 17 | LOD 100 18 | 19 | Pass 20 | { 21 | CGPROGRAM 22 | #pragma vertex vert 23 | #pragma fragment frag 24 | // make fog work 25 | #pragma multi_compile_fog 26 | 27 | #include "UnityCG.cginc" 28 | 29 | struct appdata 30 | { 31 | float4 vertex : POSITION; 32 | float2 uv : TEXCOORD0; 33 | }; 34 | 35 | struct v2f 36 | { 37 | float2 uv : TEXCOORD0; 38 | UNITY_FOG_COORDS(1) 39 | float4 vertex : SV_POSITION; 40 | }; 41 | 42 | sampler2D _MainTex; 43 | sampler2D _YTex; 44 | sampler2D _UTex; 45 | sampler2D _VTex; 46 | float4 _MainTex_ST; 47 | 48 | v2f vert (appdata v) 49 | { 50 | v2f o; 51 | o.vertex = UnityObjectToClipPos(v.vertex); 52 | o.uv = TRANSFORM_TEX(float2(v.uv.x, 1.0 - v.uv.y), _MainTex); 53 | UNITY_TRANSFER_FOG(o,o.vertex); 54 | return o; 55 | } 56 | 57 | fixed4 frag (v2f i) : SV_Target 58 | { 59 | float ych = tex2D(_YTex, i.uv).a; 60 | float uch = tex2D(_UTex, i.uv).a * 0.872 - 0.436; // Scale from 0 ~ 1 to -0.436 ~ +0.436 61 | float vch = tex2D(_VTex, i.uv).a * 1.230 - 0.615; // Scale from 0 ~ 1 to -0.615 ~ +0.615 62 | /* BT.601 */ 63 | float rch = ych + 1.13983 * vch; 64 | float gch = ych - 0.39465 * uch - 0.58060 * vch; 65 | float bch = ych + 2.03211 * uch; 66 | 67 | fixed4 col = clamp(fixed4(rch, gch, bch, 1.0), 0.0, 1.0); 68 | 69 | if(!IsGammaSpace()) { // If linear space. 
70 | col = pow(col, 2.2); 71 | } 72 | 73 | UNITY_APPLY_FOG(i.fogCoord, col); 74 | return col; 75 | } 76 | ENDCG 77 | } 78 | } 79 | } 80 | -------------------------------------------------------------------------------- /UnityPackage/Shaders/YUV2RGBA.shader.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 474e00daf40fbd74eb14a17e9a98d821 3 | timeCreated: 1452677592 4 | licenseType: Store 5 | ShaderImporter: 6 | defaultTextures: [] 7 | userData: 8 | assetBundleName: 9 | assetBundleVariant: 10 | -------------------------------------------------------------------------------- /UnityPackage/config: -------------------------------------------------------------------------------- 1 | USE_TCP=0 2 | BUFF_VIDEO_MAX=64 3 | BUFF_AUDIO_MAX=128 4 | SEEK_ANY=0 -------------------------------------------------------------------------------- /UnityPackage/config.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: c72584ede802b444192080cee94e2a32 3 | timeCreated: 1484894173 4 | licenseType: Store 5 | DefaultImporter: 6 | userData: 7 | assetBundleName: 8 | assetBundleVariant: 9 | -------------------------------------------------------------------------------- /UnityPackage/readme.txt: -------------------------------------------------------------------------------- 1 | ViveMediaDecoder for Unity - v1.1.7 2 | 3 | Quick start: 4 | 0. Download FFmpeg 3.4: 5 | - 64-bit: https://ffmpeg.zeranoe.com/builds/win64/shared/ffmpeg-3.4-win64-shared.zip 6 | - 32-bit: https://ffmpeg.zeranoe.com/builds/win32/shared/ffmpeg-3.4-win32-shared.zip 7 | Please note that the official FFmpeg builds are licensed under the GPL and that https is not enabled by default. 8 | 1. Put the following dlls into the plugins folder (e.g. 
If your project is at D:\SampleProject\, 9 | you can put the dlls in D:\SampleProject\Assets\ViveMediaDecoder\Plugins\x64\) 10 | - avcodec-57.dll 11 | - avformat-57.dll 12 | - avutil-55.dll 13 | - swresample-2.dll 14 | 2. Create a model with a MeshRenderer (e.g. a Quad) and attach ViveMediaDecoder.cs as a component. 15 | 3. Set the MeshRenderer's material to YUV2RGBA and make sure its shader is YUV2RGBA. 16 | 4. Fill in the video path (e.g. D:\_Video\sample.mp4) and enable Play On Awake. 17 | 5. Click play; you should now see the video playing on the model. 18 | 6. If the dlls are not found, check the following: 19 | - In editor mode: 20 | * Make sure your Unity editor and the FFmpeg dlls are both 32-bit or both 64-bit. 21 | * Make sure the FFmpeg dlls are placed in a correct location. 22 | E.g. if your project is located in D:\Sample\, the following directories should work: 23 | D:\Sample\Assets\ViveMediaDecoder\Plugins\x86_64\ or D:\Sample\Assets\ViveMediaDecoder\Plugins\x86\ 24 | C:\Program Files\Unity\Editor\ 25 | Other system paths in the environment variables. 26 | - Standalone build: 27 | * Make sure your build settings and the FFmpeg dlls are both 32-bit or both 64-bit. 28 | * Remember to copy the FFmpeg dlls to a directory where the build can find them. 29 | E.g. if your project is built to D:\Build\SampleApp.exe, the following directories should work: 30 | D:\Build\ 31 | D:\Build\SampleApp_Data\Plugins\ 32 | Assign the library loading path in your project (please refer to Environment.SetEnvironmentVariable). 33 | Other system paths in the environment variables. 34 | 35 | Requirements: 36 | - The plugin currently only supports Windows / DX11. 37 | 38 | API list: 39 | - bool isVideoEnabled: 40 | (Read only) Whether video is enabled. This value is available after initialization. 41 | 42 | - bool isAudioEnabled: 43 | (Read only) Whether audio is enabled. This value is available after initialization. 44 | 45 | - float videoTotalTime: 46 | (Read only) Video duration. 
This value is available after initialization. 47 | 48 | - float audioTotalTime: 49 | (Read only) Audio duration. This value is available after initialization. 50 | 51 | - int audioFrequency: 52 | (Read only) Audio sample rate. This value is available after initialization. 53 | 54 | - int audioChannels: 55 | (Read only) Number of audio channels. This value is available after initialization. 56 | 57 | - void initDecoder(string path, bool enableAllAudioCh = false): 58 | Start an asynchronous initialization coroutine. The onInitComplete event is invoked when initialization is done. 59 | Set enableAllAudioCh to true if you want to process all audio channels yourself. In this case, the audio is not played by default. 60 | 61 | - void startDecoding(): 62 | Start native decoding. It only works in the initialized state. 63 | 64 | - void replay(): 65 | Replay the video. 66 | 67 | - void stopDecoding(): 68 | Stop the native decoding process. It is called automatically in OnApplicationQuit and OnDestroy. 69 | 70 | - void setVideoEnable(bool isEnable): 71 | Enable/disable video decoding. 72 | 73 | - void setAudioEnable(bool isEnable): 74 | Enable/disable audio decoding. 75 | 76 | - void getAllAudioChannelData(out float[] data, out double time, out int samplesPerChannel): 77 | Get the data of all audio channels. This API can only be used when the flag enableAllAudioCh was set to true during initialization. 78 | 79 | - void setSeekTime(float seekTime): 80 | Seek the video to the given time. Seeks to time zero if seekTime exceeds the video duration. 81 | 82 | - bool isSeeking(): 83 | Return true if the decoder is processing a seek. 84 | 85 | - bool isVideoEOF(): 86 | Return true if the decoder has reached the end of the file. 87 | 88 | - void setStepForward(float sec): 89 | Seek relative to the current time. It is equivalent to setSeekTime(currentTime + sec). 90 | 91 | - void setStepBackward(float sec): 92 | Seek relative to the current time. It is equivalent to setSeekTime(currentTime - sec). 
93 | 94 | - void getVideoResolution(ref int width, ref int height): 95 | Get the video resolution. It is valid after initialization. 96 | 97 | - float getVideoCurrentTime(): 98 | Get the video's current time (seconds). 99 | 100 | - DecoderState getDecoderState(): 101 | Get the decoder state. The states are defined in ViveMediaDecoder.DecoderState. 102 | 103 | - void setPause(): 104 | Pause video playback. It is available after initialization. 105 | 106 | - void setResume(): 107 | Resume from pause. It is available only in the pause state. 108 | 109 | - void setVolume(float vol): 110 | Set the video volume (0.0 ~ 1.0). The default value is 1.0. 111 | 112 | - float getVolume(): 113 | Get the video volume. 114 | 115 | - void mute(): 116 | Mute the video. It is equivalent to setVolume(0.0f). 117 | 118 | - void unmute(): 119 | Unmute the video. It is equivalent to setVolume(originalVolume). 120 | 121 | - static void getMetaData(string filePath, out string[] key, out string[] value): 122 | Get all metadata key-value pairs. 123 | 124 | - static void loadVideoThumb(GameObject obj, string filePath, float time): 125 | Load the video thumbnail at the given time. This API is synchronous, so it may block the main thread. 126 | You can use the normal decoding process instead to take advantage of asynchronous decoding. 127 | 128 | Tools and sample implementations: 129 | - UVSphere.cs: 130 | Generate a custom-resolution UV sphere. It is useful for a 360 video player. 131 | 132 | - FileSeeker.cs: 133 | Used to get file paths sequentially from a folder. 134 | 135 | - StereoHandler.cs: 136 | A tool to set texture coordinates for the stereo case. 137 | 138 | - StereoProperty.cs: 139 | A data class describing the stereo state. 140 | 141 | - VideoSourceController.cs: 142 | A sample implementation that loads videos one by one from a folder. 143 | 144 | - StereoVideoSourceController.cs: 145 | Stereo version of VideoSourceController. 146 | 147 | - ImageSourceController.cs: 148 | A sample implementation that loads images one by one from a folder. 
149 | 150 | - StereoImageSourceController.cs: 151 | Stereo version of ImageSourceController. 152 | 153 | - config: 154 | Decoder config. If you want to adjust the buffer sizes or use TCP for RTSP, put it in a directory where it can be found (the same as for the FFmpeg dlls). 155 | Default settings are used if there is no config. 156 | 157 | Scenes: 158 | - SampleScene.unity: 159 | A sample that plays a video on a quad, following the quick start. 160 | 161 | - Demo: 162 | Demonstrates several use cases, including video, stereo video, image, stereo image and 360 video. 163 | Please follow the steps below to make it work properly: 164 | 1. Add two layers named LeftEye and RightEye. 165 | 2. Find the objects whose names begin with "Stereo" and set their children's layers to LeftEye and RightEye. 166 | 3. Set the Camera (left eye)'s Target Eye to Left and uncheck RightEye in its Culling Mask. 167 | 4. Set the Camera (right eye)'s Target Eye to Right and uncheck LeftEye in its Culling Mask. 168 | 5. Change the directory of each demo to your own path and click play. 169 | 170 | Changes for v1.1.7: 171 | - Fix imprecise buffer state. 172 | - Fix the issue of the lost end frame. 173 | 174 | Changes for v1.1.6: 175 | - Fix state error in all-audio-channels mode. 176 | - Add color space check in shader and remove YUV2RGBA_linear. 177 | 178 | Changes for v1.1.5: 179 | - Change the native decoding thread count to auto detection. 180 | - Update the document to use FFmpeg 3.4 because of broken playback when building with Unity 2017.2f under Windows 10. 181 | 182 | Changes for v1.1.4: 183 | - Rename to ViveMediaDecoder. 184 | 185 | Changes for v1.1.3: 186 | - Fix thumbnail loading crash. 187 | 188 | Changes for v1.1.2: 189 | - Improve software decoding performance with multi-threaded decoding. 190 | - Reduce audio playback artifacts by enlarging the overlap length. 191 | - Revise the FFmpeg dlls directory description in the readme. 192 | 193 | Changes for v1.1.1: 194 | - Fix seek function. 195 | 196 | Changes for v1.1.0: 197 | - Modify native buffer management. 
198 | - Modify the audio playback process for lower scheduling delay. 199 | - Expose the native config for the seek function and buffer settings. 200 | 201 | Changes for v1.0.7: 202 | - Add a native config file for buffer size control and for using TCP when streaming through RTSP. 203 | - Fix RTSP playback failure. 204 | 205 | Changes for v1.0.6: 206 | - Fix buffering issue. 207 | 208 | Changes for v1.0.5: 209 | - Fix the dll loading issue for the 32-bit Unity editor. 210 | - Fix the occasional crash when calling the seek function. 211 | - Fix the issue that the onVideoEnd event is not invoked correctly. 212 | - Add EOF decoder state and API. 213 | - Add API to get all audio channels. 214 | - Add thumbnail loading API. 215 | - Improve seek performance. 216 | - Make the index of FileSeeker public (read only) and modify the toPrev and toNext APIs to return the index. 217 | 218 | Changes for v1.0.4: 219 | - Add API to enable/disable video/audio decoding. 220 | 221 | Changes for v1.0.3: 222 | - Add a shader for linear color space. 223 | - Modify the sample scene for Unity 5.4. 224 | 225 | Changes for v1.0.2: 226 | - Support streaming through https. -------------------------------------------------------------------------------- /UnityPackage/readme.txt.meta: -------------------------------------------------------------------------------- 1 | fileFormatVersion: 2 2 | guid: 76641e91644e02b4d82e9195e5c2b897 3 | timeCreated: 1470108931 4 | licenseType: Store 5 | TextScriptImporter: 6 | userData: 7 | assetBundleName: 8 | assetBundleVariant: 9 | --------------------------------------------------------------------------------
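The fragment shader in YUV2RGBA.shader rescales the U and V texture samples from 0..1 to -0.436..+0.436 and -0.615..+0.615, applies the BT.601 matrix, and clamps. Those constants can be sanity-checked outside Unity; the sketch below is a hedged Python reproduction of that per-pixel math (the function name is mine, not part of the package, and this stand-in ignores the shader's fog and gamma handling).

```python
def yuv_textures_to_rgb(y, u_tex, v_tex):
    """Convert one pixel from the plugin's three Alpha8 textures to RGB.

    y, u_tex, v_tex are raw 0..1 texture samples, as the fragment shader
    reads them from _YTex / _UTex / _VTex.
    """
    # Rescale chroma the same way the shader does:
    # 0..1 -> -0.436..+0.436 (U) and 0..1 -> -0.615..+0.615 (V).
    u = u_tex * 0.872 - 0.436
    v = v_tex * 1.230 - 0.615

    # BT.601 YUV -> RGB, matching the shader's constants.
    r = y + 1.13983 * v
    g = y - 0.39465 * u - 0.58060 * v
    b = y + 2.03211 * u

    # clamp(..., 0.0, 1.0), as in the shader.
    def clamp01(c):
        return min(max(c, 0.0), 1.0)

    return clamp01(r), clamp01(g), clamp01(b)

# Mid-gray chroma (0.5, the _UTex/_VTex default) means zero chroma
# offset, so the output is grayscale equal to the Y sample.
print(yuv_textures_to_rgb(0.5, 0.5, 0.5))
```

This also explains why the shader declares `"gray"` as the default for _UTex and _VTex: an unbound chroma texture then contributes no color cast.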
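The native `config` file shown above (USE_TCP, BUFF_VIDEO_MAX, BUFF_AUDIO_MAX, SEEK_ANY) is a flat KEY=VALUE text file. As an illustration of the format only, here is a minimal parser sketch in Python; the real plugin reads this file natively, and the function name and the tolerance for blank lines are my assumptions.

```python
def parse_decoder_config(text):
    """Parse the plugin's KEY=VALUE config format into an int-valued dict."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue  # tolerate blank lines (assumption, not documented)
        key, _, value = line.partition("=")
        settings[key.strip()] = int(value.strip())
    return settings

# The shipped defaults from UnityPackage/config:
sample = """USE_TCP=0
BUFF_VIDEO_MAX=64
BUFF_AUDIO_MAX=128
SEEK_ANY=0"""
print(parse_decoder_config(sample))
```

Per the readme, the file must sit in a directory the native decoder searches (the same one as the FFmpeg dlls), and all four keys fall back to defaults when the file is absent.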