Forum Replies Created
Brian Chasalow (Participant)
I actually found a way to do Objective-C callbacks to Unity directly from the plugin, so I'm in the process of reworking the system to use NSNotifications instead of KVO. If I use NSNotifications, will they post the notification prior to a server retirement, similar to the KVO option?
Here is the guide I followed: http://www.tinytimgames.com/2010/01/10/the-unityobjective-c-divide/
But I had to make a bunch of changes after reading the embedded Mono docs; I'll detail them once I get my act together and make it all work how it should. It's some voodoo that involves searching the compiled assembly for methods by name, caching those methods, and invoking their constructors.
Brian Chasalow (Participant)
Just got it working with the Announce/Retire/Update notifications as well. The problem is really that there isn't a good way to perform a C# callback in Unity from C++ plugins.
The way I've been dealing with that previously is just polling a bool that represents 'did something change?', which is set by the KVO observer. This is not very explicit, which is a shame, as I'd like to perform different tasks in Unity depending on whether it was an Announce/Retire/Update, while at the same time performing the least amount of C# -> C++ interop; this is more a design issue than a code one, I suppose.
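For reference, the polling pattern described above might look something like this; `syphonServersDidChange()` is a hypothetical plugin export standing in for whatever flag the native KVO observer flips (the name is mine, not part of the actual plugin):

```csharp
using UnityEngine;
using System.Runtime.InteropServices;

public class SyphonServerPoller : MonoBehaviour {
    // Hypothetical export: returns true if the native KVO observer
    // saw a change since the last time we asked.
    [DllImport ("SyphonUnityPlugin")]
    private static extern bool syphonServersDidChange();

    void Update () {
        if (syphonServersDidChange()) {
            // Re-query the server list here. With a single bool we cannot
            // tell an Announce from a Retire from an Update, which is
            // exactly the limitation described above.
            Debug.Log("Syphon server directory changed");
        }
    }
}
```

The design trade-off is that one bool keeps the C# -> C++ interop surface tiny, at the cost of losing which kind of event occurred.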
Brian Chasalow (Participant)
I'm OK with just receiving notifications after the event. My question, I suppose, had more to do with what the 'object' and 'change' were referring to. I know just enough Objective-C to get myself into trouble.
Why does the 'indexes' key in the 'change' NSDictionary* refer to an NSIndexSet and not an NSNumber? Is it because there could be multiple changes happening in the same frame?
Brian Chasalow (Participant)
Unity / Build Settings / Player Settings / Run In Background
Brian Chasalow (Participant)
That's a good question regarding the pre/post-switchable GUI situation. I will think about it as well and see if I can come up with anything.
Edit: there's no easy answer for capturing the OnGUI layer, I'm afraid. I will keep at it, and do let me know if you have any luck.
Brian
Brian Chasalow (Participant)
If all you're trying to do is capture the GUI, what if you send the Unity screen texture to Syphon in OnGUI() and do something like:
if (Event.current.type == EventType.Repaint) syphonServerPublishTexture(blah);
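Fleshed out a bit, that idea might look like the following (untested sketch; it assumes the same syphonServerPublishTexture plugin export used by the server script, and note the ReadPixels cost mentioned below):

```csharp
using UnityEngine;
using System.Runtime.InteropServices;

public class PublishGUILayer : MonoBehaviour {
    [DllImport ("SyphonUnityPlugin")]
    private static extern void syphonServerPublishTexture(int nativeTexture, int width, int height);

    private Texture2D _screenTex;

    void Start () {
        _screenTex = new Texture2D(Screen.width, Screen.height, TextureFormat.ARGB32, false);
    }

    void OnGUI () {
        // Only publish on the Repaint pass, not on Layout or input events.
        if (Event.current.type == EventType.Repaint) {
            // Read the back buffer (GUI included) into the texture, then publish.
            _screenTex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0, false);
            _screenTex.Apply(false);
            syphonServerPublishTexture(_screenTex.GetNativeTextureID(), _screenTex.width, _screenTex.height);
        }
    }
}
```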
No idea if it'll work, or even if this relates to your issue. It sounds like you have some effects using OnPostRender that are conflicting with the Syphon rendering order?
I'd take a look at a small Unity project in a .zip showing the issue if you want to post one.
Brian
Anton: he's using Unity's wrapped Texture2D.ReadPixels, which ain't quite the same as the direct glReadPixels command. If he wanted to write a plugin he might be able to use glCopyTexSubImage2D and such, but that's a whole other can of worms.
Brian Chasalow (Participant)
This question should actually have an easy solution. Unity appears to render in order of script placement on a GameObject. Make sure the Syphon script is the last (lowest) script attached to the GameObject, and that any bloom/glow/etc. are above/before it.
Brian Chasalow (Participant)
Oh, that gave me an idea, Tom. Thanks. Since I'm already registered for that notification, I can just wait until I receive the very first notification to switch my bool that says 'my directory is loaded'.
Brian Chasalow (Participant)
I'm so nostalgic for the days when I enjoyed using Max. It's like a bitter divorce with an ex-wife: she got the fishtank. 🙁 Just sayin'.
Brian Chasalow (Participant)
I couldn't pinpoint your problem, but it probably has to do with the fact that Unity internally uses power-of-two texture sizes for everything. What I'd suggest, for instance, is to never use 1024 x 768 as the 'desired' size in Syphon; use 1024 x 1024 instead. It will scale to your surface, and then you can resize the aspect ratio of the surfaces appropriately. This may turn out to be an issue with Projectors, which do not necessarily conform to surface UVs in the same way that regular textures do.
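As a sketch, the padding rule amounts to rounding each dimension up to the next power of two (the helper below is mine, not part of the Syphon scripts):

```csharp
public static class TextureSizeUtil {
    // Round a dimension up to the next power of two, e.g. 768 -> 1024, 1024 -> 1024.
    public static int NextPowerOfTwo (int n) {
        int p = 1;
        while (p < n) p <<= 1;
        return p;
    }
}
```

So a 'desired' size of 1024 x 768 ends up padded to 1024 x 1024 anyway, which is why requesting 1024 x 1024 up front avoids surprises.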
From Aras : http://forum.unity3d.com/threads/18607-POT-and-NPOT-textures-problem
"And do not worry about hardware support; we do store textures at NPOT size (to save space), but at load time we create a texture that is next power of two, and pad that texture with dummy pixels (and if original NPOT texture is DXT compressed, we blit the DXT compressed blocks and fill the rest with dummy DXT blocks – so the resulting texture is still DXT compressed). And then we do some magic so that GUI that uses that texture is still always referencing the correct portion of that 'padded up' texture."
Brian Chasalow (Participant)
What version of OSX are you running, and what version of Unity? It should 'just work'.
Brian
Brian Chasalow (Participant)
Solved. My mistake earlier; it was a bad suggestion to use mainTexture. The Projector shaders use "_ShadowTex" instead of _MainTex, so you need to access their textures differently.
First, use the Projector/Multiply or Projector/Light shader. Second, add the Falloff texture to the falloff texture slot of the shader (just like in the prefab Unity provides).
Then do this:
void Awake ()
{
    _texture = new Texture2D(desiredWidth, desiredHeight, TextureFormat.ARGB32, false);
    _texture.Apply(false);
    Projector proj = gameObject.GetComponent<Projector>();
    _texture.wrapMode = TextureWrapMode.Clamp;
    proj.material.SetTexture("_ShadowTex", _texture);
}
Everything else should be the same / just work.
Brian Chasalow (Participant)
There are a number of settings that make a texture repeat like that; the most obvious is the 'repeat' setting of the texture's wrap mode: http://unity3d.com/support/documentation/ScriptReference/Texture-wrapMode.html
But it also has to do with the UVs of the model itself. The built-in cube is set up to repeat on each face like that.
From your screenshot, it looks more like you're using Syphon to receive a texture source on the cubes, not as a Projector, but it's hard to tell what's going on from the screenshot.
If you can post a small example scene (in a zip file) I can take a look.
Brian Chasalow (Participant)
You cannot use the web player; try a standalone OSX build.
You should be seeing Unity as a server after you build/run.
Brian Chasalow (Participant)
Hey there. Please see the Google Code implementation site, as it has an updated version that does not require any of the 'makesyphonclientwork' band-aid fixes.
There are a few more updates I need to commit, but I am out of town on a job right now; I'll get to it when I get back. Cheers,
Brian
Brian Chasalow (Participant)
Hey there, Brian Chasalow here; I've been working on the Unity implementation. I am 99% certain that you can, but you would have to modify the SyphonClientBridge to get the texture ID of the Projector's material, as a Projector is not a 'standard renderer' like most other objects.
You'd need to use projector.material.mainTexture, I think. So, wherever the SyphonClientBridge refers to mainTexture, switch in the Projector's material instead of renderer.material.
To get the Projector's material and native texture ID (untested code, warning):
Projector proj = GetComponent<Projector>();
proj.material.mainTexture.GetNativeTextureID();
Keep me updated with your progress! Next week I can help you more, but I'm out of town right now. Good luck,
Brian
Brian Chasalow (Participant)
Dumb question, but how would you load a separate patch in a standalone? Some hackery with a bpatcher?
Brian Chasalow (Participant)
And this probably goes in the 'Implementations' forum. Oops.
Brian Chasalow (Participant)
Just as another note: using these two scripts appears to allow you to use Unity as a Syphon server WITH transparent shaders on your objects, as long as you have a skybox, and as long as you're ignoring alpha channels in your client.
It was definitely unhappy when trying to combine server + client re-routing in and out, though…
Brian Chasalow (Participant)
SteveElbows: that's definitely on our radar, multiple-server support for Unity as a client.
I checked out your demo; those are certainly similar issues to what we're experiencing. For now, do four things:
1) Go to File/Build Settings/Player Settings, and check 'Run In Background'.
2) Go to Edit/Render Settings, and drag a global skybox from your Project window to where it says 'Skybox'.
3) Replace the SyphonServerBridge with this script:
using UnityEngine;
using System.Collections;
using System.Runtime.InteropServices;

public class SyphonServerBridge : MonoBehaviour {
    // 1) Go to File/Build Settings/Player Settings, and check 'Run In Background'.
    // 2) Go to Edit/Render Settings, and drag a global skybox from your Project window to where it says 'Skybox'.
    private GameObject _RTCameraObject;
    private Camera _RTCamera;
    private RenderTexture _rTexture = null;
    public int desiredWidth = 1024;
    public int desiredHeight = 768;

    [DllImport ("SyphonUnityPlugin")]
    private static extern void syphonServerPublishTexture(int nativeTexture, int width, int height);
    [DllImport ("SyphonUnityPlugin")]
    private static extern void syphonServerDestroyResources();

    // Use this for initialization
    void Start ()
    {
        _rTexture = new RenderTexture(desiredWidth, desiredHeight, 24);
        _rTexture.format = RenderTextureFormat.ARGB32;
        _rTexture.isPowerOfTwo = false;
        _rTexture.isCubemap = false;
        _rTexture.Create();
        // create a new render texture camera object, parent it to this object
        _RTCameraObject = new GameObject();
        _RTCameraObject.transform.parent = transform;
        _RTCameraObject.name = "RTCamera";
        _RTCameraObject.AddComponent("Camera");
        _RTCameraObject.AddComponent("makeSyphonServerWork");
        // clone the current camera's settings to the RT camera
        _RTCameraObject.camera.CopyFrom(camera);
        // add the render texture target to the RTCamera object
        _RTCameraObject.camera.targetTexture = _rTexture;
        //<edit nov 7 2010 - bc>
        RenderTexture.active = _rTexture;
        //</edit>
    }

    public void callFromRTCamera() {
        syphonServerPublishTexture(_rTexture.GetNativeTextureID(), _rTexture.width, _rTexture.height);
    }

    // Also called in the editor when play is stopped
    void OnApplicationQuit ()
    {
        syphonServerDestroyResources();
    }
}
4) Then create a new file, call it makeSyphonServerWork.cs, and write this to it:
using UnityEngine;
using System.Collections;

public class makeSyphonServerWork : MonoBehaviour {
    private SyphonServerBridge myScript;

    // Use this for initialization
    void Start () {
        myScript = transform.parent.GetComponent<SyphonServerBridge>();
    }

    static Material lineMaterial;

    static void CreateLineMaterial() {
        if( !lineMaterial ) {
            lineMaterial = new Material( "Shader \"Lines/Colored Blended\" {" +
                "SubShader { Pass { " +
                "    Blend SrcAlpha OneMinusSrcAlpha " +
                "    ZWrite Off Cull Off Fog { Mode Off } " +
                "    BindChannels {" +
                "      Bind \"vertex\", vertex Bind \"color\", color }" +
                "} } }" );
            lineMaterial.hideFlags = HideFlags.HideAndDontSave;
            lineMaterial.shader.hideFlags = HideFlags.HideAndDontSave;
        }
    }

    void OnPostRender() {
        CreateLineMaterial();
        lineMaterial.SetPass( 0 );
    }

    void OnRenderObject() {
        CreateLineMaterial();
        lineMaterial.SetPass( 0 );
        myScript.callFromRTCamera();
    }
}