This one – no, and the sensor information is not fused in any way. Actually, I would suggest using a quaternion implementation rather than a Kalman filter, like here http://www.himix.lt/?p=915 (sensor fusion is done there), when you use an array of three sensors (accelerometer, gyroscope and magnetometer). The use of a Kalman filter would not provide noticeable improvements over quaternions (I've done lots of experimentation).
BUT! If you use only one sensor, for instance an accelerometer, I would recommend using a Kalman filter.
I'm planning to make a KF tutorial in the near future.
This book comes with a DVD on which you will find the Augmented Reality software. It might be games, it might be other exciting things related to Augmented Reality. So this is only a DEMO of the book, with games that someone else developed.
I want to make my final-year B.Tech project on augmented reality, so I need such things. You have also mentioned the Ultimate project; what is it? Can you help me build a project?
I can tell you that the Ultimate project won't be available for free; it is something that I've been working on and improving for several years. It's a combination of Augmented Reality and Arduino.
Tutorials are the way I help. Can't you find anything that fits your idea? What is your idea?
My idea is image recognition and text recognition connected to the internet. For example, if I want to know about a book, I just point my camera at the book cover and it tells me the reviews. It's basically a part of the Sixth Sense technology developed by Pranav Mistry.
Image and text recognition is basically covered in my tutorials, but it is all predefined (images, text). If everything is predefined, it is easier. What worries me is the search on the internet. But it might be that you want a slightly different application, for example: take a picture of any book's cover, recognize it properly and search for it on the internet?
Watch the video closely; there are two parts, one for the Arduino and photoresistor, the other for Processing and Augmented Reality while acquiring the photoresistor data.
Hi! I'm new to AR and Unity, but I have been a software developer for over 10 years. Thank you in advance, I will try your tutorial. It looks like a lot of fun! 🙂
Hi, I haven't started this tutorial yet. But maybe you have some clothes I could use in the future for this tutorial? Then I won't need to search for them myself.
If you ask generally, then of course it is possible. If you ask whether I will do it: I will, but I can't tell you whether it will be available as a tutorial or a DEMO. We'll see in the future. 🙂
Hello sir, I am a student and I want to develop my own project: interior design using markerless augmented reality. How can I do that? Can you give me a demo on that topic which I can use as a reference for my project? Please give me a demo on interior design using augmented reality.
Actually, I am very confused, so it would be really great if you could help me out. Please let me know how it works through your demo of interior-design augmented reality. I have prepared the marker-based interior design by following your demos, but now I want to do it markerless, so please give me one demo of it.
This is where I suggest you start: https://www.youtube.com/watch?v=qfxqfdtxyVA
This is markerless AR. Just start by adding your interior design content. There is no need for a separate tutorial on this.
sketch_aug14b:32: error: ‘ADXL345’ does not name a type
sketch_aug14b.ino: In function ‘void setup()’:
sketch_aug14b:40: error: ‘adxl’ was not declared in this scope
sketch_aug14b.ino: In function ‘void loop()’:
sketch_aug14b:46: error: ‘adxl’ was not declared in this scope
Please, can anyone say how to solve this error?
I just downloaded the library and pasted it in, but I'm still getting this error. What do I have to do?
Reply as soon as possible.
Are you sure you copied the library to the right directory? Could you paste me the path to this library?
After you copied the library, did you restart the Arduino IDE?
Suppose I want to take a runtime image directly into my application and let the user place it wherever they want; how can I do that? Which platform will be suitable for my project? Suppose I want to take an image from an online shopping site as input to my application, and as output the user will see how that interior looks.
Just to make it clear, you're talking about the app using META glasses, right? By "runtime image directly" do you mean taking a snapshot from the camera with augmented content and placing that picture anywhere you want in augmented reality?
Hello, very good job. I have this:
Error DllNotFoundException : MetaVisionDLL
Meta.CanvasTracker.Release ( )
Meta.CanvasTracker.OnDestroy ( )
Would you know how to fix it? Thank you.
Hello, I was wondering if I could use this library on a 2-axis accelerometer. I will download the library now and see if you utilize function overloading so that I can pass in only values for the x and y axis; but if not, do you have any ideas? I have a 2-axis accelerometer that is hooked up for I2C ONLY. Please let me know if you have any suggestions or advice. Thank you. -Joe
The ADXL345 is a 3-axis accelerometer, and using the library provided you should have no problems acquiring that information. I2C works perfectly for that.
Instead of creating confusion with words, this is what I want: to create an Android application for interior design with augmented reality: https://www.youtube.com/watch?v=ipkz6y9mfvk
I want to create an Android application in which the user can buy furniture from online shopping sites; using my app, the user will be able to see an augmented view of that furniture in their place, and if they like it they can purchase it. I hope this time it is clear what I want to say.
Hmm, yes, I know it's not so simple and easy, but I want to do it, not for a particular output but for the sake of knowledge. Please, can you guide me on this? I will do my best; just guide me, I will work hard on it, please.
Okay, thanks, but I have to submit it as my college project, so let me know where I should start. Which platform will be suitable for my project, and what should I do first? Just let me know and I will be able to start my work.
Can you give me a way to do this: after tracking the money paper and getting a lot of papers, when I zoom in on the virtual papers I want to replace them with another image.
…
Thank you.
Hi, greetings from Rio de Janeiro, Brazil! First, I want to thank you for all your tutorials; I have been trying to learn more about Unity3D since I started to watch your videos. But I have this question: if I want to use my smartphone as stereo glasses with augmented reality, does Vuforia generate an output app for this? Thanks once again, and I hope your ideas help to transform our world into a better place!
Hello, Ricardo. Thank you for the kind words.
Actually, I don't know the answer; I haven't done anything like that so far. I mean I haven't tried to use a smartphone as AR glasses. But if you find some useful info on the internet later on while researching, please let me know; I'm interested in everything related to AR.
Yeah, it's not in the gallery, but it's somewhere in your device's memory. If you know how to modify this code in order to send the pictures to the gallery, please let us all know! So far I have achieved this (saving pictures to the gallery) only by using Unity3D assets/plugins, which come at a price.
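For reference, here is a minimal sketch of the kind of capture call being discussed, assuming Unity 5's Application.CaptureScreenshot API; on Android the file ends up under Application.persistentDataPath (the app's private storage), which is why it never shows up in the gallery. The key binding and file name are assumptions for illustration.

using UnityEngine;

// Hypothetical sketch: capture a screenshot on a key press and log where it lands.
// On Android, Application.CaptureScreenshot writes relative to persistentDataPath,
// i.e. the app's private storage, not the shared gallery.
public class SnapshotExample : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Assumption: timestamped file name, purely for illustration.
            string fileName = "snapshot_" + System.DateTime.Now.Ticks + ".png";
            Application.CaptureScreenshot(fileName);
            Debug.Log("Screenshot saved to: " + Application.persistentDataPath + "/" + fileName);
        }
    }
}

Moving that file into the device gallery still needs a native plugin (or a paid asset), which is exactly the limitation mentioned above.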
Hi. I have downloaded Processing 3. When I run simpleLite the following error is shown in the console. “No library found for processing.video
Libraries must be installed in a folder named ‘libraries’ inside the ‘sketchbook’ folder.”
Any solution?
Hi,
I don't have a camera on my computer, although I would like to use the camera of my mobile phone. Is it possible to make the program search for the IP address of a camera?
Cheers
I downloaded your project from this site.
Only one object is viewed at a time; I want it to run as in your video.
I run the app on an Android mobile; is any setting required to run the app in multi-object mode?
I would suggest starting by recognizing the Kuriboh card first, and after that the Blue-Eyes White Dragon. Actually, the Kuriboh card does not have many good tracking features. Before using the app on Android, did you test it in Unity3D play mode? I would suggest you do so and check how well the multi-tracking works.
Thanks! Actually, I don't plan to make a tutorial on Google Glass as I don't have it. I have META AR glasses, as I've seen more potential in them. Who knows what the future will bring 🙂
Hello, I've been trying to make a Cylinder Target based on your tutorial,
and while I'm trying to upload the SIDE image to the Vuforia Target Manager, it fails, saying the Euforia of Beauty Logo image doesn't match the dimensions. Could you post a tutorial on how to measure the image so it can be uploaded to the Target Manager?
I've added a rescaled image (Download # Print Euforia of Beauty Logo to Augment the Content and Create the Tracker for Cylindrical Object (*.jpg file)); you can try to upload it once again.
Hello, I tried to make a rotation button on Android,
but it's always looping.
The only way to stop it is to hold the button,
but if I release the button, it loops again.
Thanks a ton for all this knowledge sharing. I have become a serious follower of your tutorials and have also subscribed to your YouTube channel. These tutorials are great assets for people like me who are getting into the AR field.
Hi, I am totally new to AR and software development (not a programmer at all). Thank you, I tested it out and it works. But I have a question: if I want to use my own 360° picture, how can I upload it and use it?
Hi, thank you for the tutorial, it is a really good kick start for someone who is totally new and wants to learn, like me. However, I have a question: after doing everything you did in the video, when pressing Play, how do I link it to my Android? Are there any videos of yours which explain this?
Hi,
Thank you for the tutorial video, it is a really good kick start for someone totally new like me.
It would be great if you could help me with my question.
After doing everything you did in the video on a PC and pressing the Play button, how do I link the program to my Android device?
I have been trying to run this code for hours, but for some reason the Arduino IDE ver. 1.6.5 on Windows 10 cannot find HMC5883L.h. I've placed copies of the library in the main library folder, the sketchbook library folder and the hardware library folder, but it still can't find it. I also used the Library Manager. Help!
Hi, I have Windows 10 x64, I tested it right now and it works perfectly. Make sure that the library directory looks something like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_Example
not like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_library\HMC5883L_Example
Depending on how you extract the libraries, you might end up with two identical folders nested one inside the other (HMC5883L_library\HMC5883L_library\HMC5883L_Example), so I assume the Arduino IDE can't recognize it.
Hi, I don't know why, but the Arduino software doesn't recognize the ADXL345 library. The program shows it in black instead of orange.
The folder is in this directory: C:\Users\Leyre\Documents\Arduino\libraries\ADXL345_library, with the other libraries (which it recognizes well), and the ADXL345_library folder is not duplicated.
Nevertheless, when compiling, the software doesn't show any error.
Any idea? I need help, please.
First you said it does not recognize the library, but then you say that it "doesn't show any error". I don't understand. How do you know that it can't recognize the library if no errors are shown?
Thank you for answering.
The program shows the library and all the functions related to the accelerometer in black instead of orange, whereas the rest are in orange. Isn't that weird?
Nevertheless, it compiles and I can upload the program to the Arduino UNO board (although the data acquired is really weird).
That's really weird; it's hard to suggest something right now, but answer this question: are you really using the GY-85 board, yes or no?
This is important because I had some weird data readings while using a board with only the accelerometer alone, and I couldn't find a solution for that.
Hi again,
I have been trying the code the entire day, but I don't understand the outputs.
I want the angles in degrees. On the one hand, I get rolldeg and pitchdeg between 20 and -20 instead of 90 and -90 degrees.
On the other hand, when I print anglegx, anglegy and anglegz, I get signals which change even when the IMU is stationary.
I would like to add a complementary filter to your code, but with those output data I can't.
I have watched the video, and it seems to me that the output data shown there are correct, but I don't get the same.
Could you help me, please? I'm a little desperate because my project depends on a good measurement of the angles.
Best regards and sorry about the mess
If you send me some pictures of the sensor wiring to the Arduino microcontroller and screenshots of the error in the program, maybe I will be able to help you.
Hi Edgaras, thanks for these great tutorials,
they have been a great help.
I'm just wondering why you don't have audio in these tutorials, explaining what you do; it would provide much more help.
Thanks for your reply. I really need a tutorial on how to use Vuforia and the Cardboard SDK.
The app would scan the image target to track the AR world, and the user would be given a button that, when looked at, teleports the user into the VR world and vice versa, just like the Vuforia sample.
I am having the same problem; after checking, the error is in this variable:
ADXL345 adxl; //variable adxl is an instance of the ADXL345 library
Arduino_AccelerometerADXL345_Servos:32: error: ‘ADXL345’ does not name a type
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void setup()’:
Arduino_AccelerometerADXL345_Servos:40: error: ‘adxl’ was not declared in this scope
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void loop()’:
Arduino_AccelerometerADXL345_Servos:46: error: ‘adxl’ was not declared in this scope
Hello! Student studying physical computing here. To run by “dropping in,” are there any constraints for dimensions/scale or file size for the OBJ model?
If your *.OBJ model file is huge, most likely the model itself is really complex and has lots of vertices/polygons. I don't know the exact constraints on this matter; it also depends on your computer specifications. You should sort this out by experimenting with the models you have (if they are really complex).
Hello, when I move the sensor fast, the servo motor locks up and does not follow the movement, and the servo motor makes a noise (tec tec tec); then, after resetting, it goes back to normal. Can you say what it could be?
I'm using a TowerPro MG946R servo motor powered directly from the Arduino's 5 V.
Hello! Good day! I am a beginner and I need your help; please help me. I downloaded your code; I have the ADXL345 3-axis digital accelerometer and 2 servo motors. I made the connections you've indicated above. But like other comments say, there are some errors. What should I do, sir, and where should I start? I don't understand the library stuff you mentioned. What library is it, where can I find it, and where should I copy it?
For starters, I didn't make the app for this book; this is only a DEMO of what other developers did. I haven't tried it yet, but one of the users provided a link (Vuforia standalone) in order to make Vuforia work on the PC platform. However, I haven't tried it yet, and right now I can't find the link; it's somewhere in one of the tutorials' comment sections.
Hi, I already made those examples, but I have a question: do you know about the DragonBoard 410c? I want to run my app on that board. I installed Ubuntu Linaro, but my app doesn't work with it… Have you exported and run any of your apps on some other board?
Hi,
I will buy the META Dev Kit 1 next time. I have a question about your work:
with which program do you make these videos?
In the videos the FOV is extremely big! Is this only in the video, or on the META too? I have a BT-200, and this device has a small FOV at near distance.
Do you have contact with META about when Dev Kit 2 appears?
I suggest comparing the specifications of the BT-200 and the META glasses. I haven't had the opportunity to use other AR glasses (only META), so it's hard to compare, but those who put on the META glasses for the first time say that the FOV is not so big. So, of course, watching the video here and using it for real is like night and day.
Hi! Great tutorial, but I am stuck on the building part… I am doing the same as you in the video, but my APK is not working on Android… it shows a black screen… I have never developed for Android and I believe I am missing something in the setup. My project works great in the Unity preview, and the whole build process is OK too [no errors], but after copying it to the device and installing, nothing happens. Can you point me in some direction? I need to run the project on a phone and everything seems to be OK except this.
Thanks a lot!
It's hard to say, but if it's not a "top secret" project, send it to me and I'll take a look. We can start from the *.apk file. Maybe it's the smartphone's problem. Who knows. It's also worth googling this problem.
Hi
Thank you so much… this is very useful to beginners like me.
I have one question.
I tried this with two different objects on an Android device. The issue is that both my objects are visible from the start, and the buttons didn't work.
Did you put the virtual buttons on a textured part of the tracking object? If not, I suggest you do so. Don't put virtual buttons on a plain area without any texture. I hope that helps.
Hello,
thanks for your answer.
I think META will bring out a new version of their glasses soon; I think the name is Dev Kit 2.
Do you have any information about this release?
I can see that you're not using Windows, so it's hard for me to suggest something. Also, you're using Processing 2.1.2, not 2.2.1, but I don't think that causes the problem.
Hi, a thousand thanks for your code! I tried several that didn't work, but this one works perfectly!
By the way, I need to interface with two sensors; I don't know how to change the code to measure two. Could you please help me?
No, I can't. These are raw data readings from the sensor.
Alright, thanks anyway.
Hi,
I'm an interactive developer using Unity3D; I sincerely need your help on how to integrate Vuforia and the Google Cardboard SDK in Unity3D for architectural visualizations.
Hello, John. I haven't done anything on that yet, but I will soon. You can wait for a tutorial (2 to 3 months), but if you want it quicker, it won't be for free.
But I received an error at runtime: it couldn't find the path of ICSharpCode.SharpZipLib.dll to unzip the file. I couldn't understand it; I am not a programming guy…
Can you help me?
Is C# a language that supports Android?
Sorry, it is difficult for me to understand the script and do something with it.
C# is supported by Unity, and using Unity we can export an app for Android devices (and other OSes). I really haven't stumbled upon this problem; can you copy-paste the path to your project files? Have you tried anything simpler up until now? For instance: http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ I suggest you start there and then move on.
Can you please share the code?
Actually, I need the code with which you drew the rectangle around the marker, and how to return the coordinates of the corners of the marker.
Well, actually there is, but it would be more or less a "workaround" that I would not suggest using. What do I mean? Basically, you would need to export your model animation for every frame. So most likely you would end up with lots of *.obj files, which you would need to load frame by frame in the void draw() part.
Option 2: search for a library that is able to import an animation file (*.fbx extension). I couldn't find a better solution back in those days, but maybe one is available now. Who knows… some research is needed.
Option 3: try AR tutorial No. 14, which involves Unity3D and Vuforia; you can add animations there quite easily (see the sketch below).
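For Option 3, a minimal sketch of controlling an animation on the augmented model in Unity, assuming the imported model has a legacy Animation component and a clip named "Take 001" (both names are assumptions; hook the methods up to UI buttons or tracking events as needed):

using UnityEngine;

// Hypothetical sketch: play, restart or stop a legacy Animation clip on the model.
public class ModelAnimationController : MonoBehaviour
{
    public Animation modelAnimation;      // assumption: drag the model's Animation component here
    public string clipName = "Take 001";  // assumption: default clip name of an imported FBX

    public void PlayFromStart()
    {
        modelAnimation.Rewind(clipName);  // jump back to the first frame
        modelAnimation.Play(clipName);
    }

    public void Stop()
    {
        modelAnimation.Stop(clipName);
    }
}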
Thanks for your advice. Actually, I went through all your videos and the Unity ones are quite cool. But I am a newbie to Unity, and coding any logic into it is becoming difficult.
But I will try the FBX part you mentioned.
Hi, thanks for all the tutorials.
I made my AR APK with Unity + Vuforia; my mesh model comes from my own SketchUp models (imported as FBX into Unity). When I try it on the PC it works fine, but when I installed it on my smartphone it works with slow response and lag. Any suggestion for the AR APK size? (Mine is 187 KB.)
Next question… how do I make a quit button in the app? Thanks.
Now I'm developing an AR/VR app; does this snapshot button just snap the on-screen interface?
Does this work on a stereoscopic interface? I want just a monoscopic (single) image saved.
Is it possible to give us any info or hints about the making of this demo, or can you please mention any reference we can use to learn scripting in Unity and achieve the same results?
Yes, I know of course you are using a distance script… thanks… but the main part is how to modify the distance script parameters, or how to write a new script myself using PlayMaker or something else.
What is the best way to learn to script in Unity?
Hello,
Is it possible to turn the buttons into virtual ones that can be pressed with your hand, like you demonstrated in another tutorial (No. 19)? I don't succeed when I try to merge the UI code from this tutorial with the Virtual (Vuforia) buttons one…
I thought it's just because I'm a lame coder… Is there a way to extend interactions with virtual buttons (Vuforia-Unity)? For instance, jumping to the next scene or playing a video?
Btw, thanks for all your tutorials, they are very helpful.
Yes, there is a way. The same way I switch models in this tutorial (http://www.himix.lt/augmented-reality/augmented-reality-virtual-buttons/), you can add different functions: load another scene and so on. Just dive into the code in "VirtualButtonEventHandler.cs" (starting from case "btnLeft": and case "btnRight":); a sketch of the idea follows below.
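For illustration only, a minimal sketch of how such an extension might look, assuming the sample-style Vuforia virtual button API and the hypothetical button names "btnLeft"/"btnRight"; the scene name "NextScene" is also an assumption:

using UnityEngine;
using UnityEngine.SceneManagement;
using Vuforia;

// Hypothetical sketch: attach to the ImageTarget so that pressing "btnLeft"
// loads another scene and "btnRight" triggers something else (e.g. a video).
public class VirtualButtonSceneSwitcher : MonoBehaviour, IVirtualButtonEventHandler
{
    void Start()
    {
        // Register this handler with every virtual button under the ImageTarget.
        foreach (var vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonAbstractBehaviour vb)
    {
        switch (vb.VirtualButtonName)
        {
            case "btnLeft":
                SceneManager.LoadScene("NextScene"); // assumption: a scene with this name exists in Build Settings
                break;
            case "btnRight":
                Debug.Log("btnRight pressed - start a video or another action here");
                break;
        }
    }

    public void OnButtonReleased(VirtualButtonAbstractBehaviour vb)
    {
        // Nothing needed on release in this sketch.
    }
}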
I will try until I make it, thank you very much!
There's one more issue no one could answer on Vuforia's forum: is it possible to trigger a whole environment into which the viewer can dive?
Should I use extended tracking (the triggered image would be much bigger than the marker) and keep the image target active even when tracking is lost (can I just disable this function to keep it on even if I turn the device/camera in another direction)?
I've read about different plug-ins, like Unified Coordinate System, that could help build augmented environments… Could you point me in a direction I should go?
Cheers!
It is possible, but I haven't done it in Unity. Basically, what you need is what I did here with a MARG sensor (http://www.himix.lt/arduino/arduino-and-virtual-room-using-mpu-9150-marg/), just with pictures and without tracking any image target. The same applies to smartphones and tablets. I haven't heard anything about the plugin you mentioned.
Hi, how can I connect it to the Ethernet shield? And what could I set up as an output device using the flame sensor?
I am serious; currently there's probably an even newer version. This asset was downloaded not from the Unity Asset Store, but from Leap Motion's website.
Hi, thank you so much for this awesome tutorial. By the way, can a particle system be controlled by our 3D object instead of using Arduino? For example, when I click a 3D object such as a factory, the particle system, for example smog, will be emitted. I'm trying to add interaction to my AR project. I really hope you can help me. Thank you.
Can I implement the app on a tablet connected to an external camera?
And how do you track each part of your body? Can Kinect distinguish each part and give it a tag?
I don't know whether an external camera can be connected to the tablet; I haven't tried to do so.
And yes, Kinect can distinguish different parts of your body; I mean it tracks your body parts/joints, their position and orientation.
Hello, I'm from Peru. I have seen the Unity3D tutorials on the website and the scripts are not correct. Thanks, admin, for the help you give us in your tutorials, because the examples on the Unity website didn't work for me at all.
I'm amazed, I have to admit. Rarely do I encounter a blog that's both educative and entertaining, and without a doubt, you've hit the nail on the head. The issue is something which too few men and women are speaking intelligently about. I'm very happy that I came across this in my hunt for something regarding this.
Hi, I have followed your tutorial and everything works fine in play mode, but I can't build the project for Android.
Worth mentioning that I configured everything like you show in the money tutorial, and that project works fine… Can you help with this matter, or is text recognition not working on Android?
Thanks in advance.
Bart
May I ask, what if there are two models in one scene? Your tutorial only has one model in one scene. Then what happens to the tag? Can I tag both of my models as Model?
I have tried setting both characters to the same tag name, Model, but it does not work.
What happens is that only one character scales up and down when I click the button; nothing happens to the other one.
I'm creating a scene that has two characters: one person performing CPR and the other person is the patient. I need both characters to scale up and down at the same time when the button is clicked.
How to create or how to use already created models?
You can create models using 3Ds Max, Maya, Blender, SolidWorks and lots of other 3D modelling tools.
How to use them? You should put the model in the data folder and change some Processing code (you will find out if you watch closely).
How do I reset the animation when the target is detected again? It seems the animation just pauses when the target is not detected and continues playing afterwards.
Hi, great tutorial, thank you; you helped me a lot in learning AR. Can you help me with one question: the moneyar.apk installs OK, but when I open it the screen goes black and nothing happens 🙁
Great tutorial, thanks, but I want to ask a question. I am developing a Vuforia app,
but when I take a screenshot using your code, my result is only a white screen, and the augmented view has only a white background. Can you help me, please?
Sorry for my bad English, thanks.
Hi, I did it the same way you did and everything worked, except that the UI buttons stay on screen even when the image is not tracked. The UI buttons just stay at the last place where it was tracked. I want the buttons to appear where they should when the image is tracked, and to disappear when it is not tracked. Please help. Thanks in advance.
Hi, first, thank you for your work! I have the same problem: I did everything, but the UI buttons stay on the screen no matter whether the Canvas is inside or outside the ImageTarget; the result is the same, the UI buttons are always on screen.
Please help. Thank you.
I have the same problem as Nqb. I have tried making the Canvas a child of the Image Target, but the Canvas is still rendered on the screen even if the image is not tracked. Can someone please help me out? Thanks in advance.
Hey, could you suggest any tutorials for using real-world marker input instead of virtual buttons? The plan is to make an application which reads the position of a real-world marker and responds based on its hovering over a real-world button printed on the paper page instead of virtual buttons.
I got the same problem as Nqb; I already put the Canvas with the buttons inside the Image Target, but the result is still the same: the buttons still pop up when the marker is lost.
Hello, thanks for the tutorial, it's helpful 🙂
As far as I understand, the primary surface is used to track the scene, which could be the size of a dining table. What if there were 5 image targets in 5 different places (not far away); would that extend the size?
Example: let's say I put 4 image targets at the edges of a table; would they all be tracked at the same time with the same "props" for all of them, or would each target define its own scene and props?
This is something you will have to test on your own, but I would say each target would have separate "props", not the same ones. That is my logical guess.
Yes, man; on the PC I can see the button and the panel, but on the phone they seem to be invisible. When I tap randomly on the screen and happen to hit them, they work, but I can't see them!
Hi, I downloaded your project, and when I export it for Android I can't see the button on the smartphone, but when I download your APK it works. What's the problem?
I have retested it just now (exported the APK). It works great, actually.
Hi! Thanks for your work! I tested it and it worked very well.
I'm looking at the code and I have some stupid questions to ask you. I would be glad of your feedback (sorry in advance for the dumb questions, but I'm not an expert in Arduino).
0.1 – What is the raw output of the accelerometer? Voltages? (from readAccel)
0.2 – What is the raw output of the magnetometer?
0.3 – What is the raw output of the gyro?
1 – Line 47: when reading the gyro, accelerometer and magnetometer, you have a FOR loop of 201. May I ask why?
2 – Line 88: why 255? Is there a pre-set offset of 255 deg?
3 – Line 89: why are you dividing the raw data by 256?
4 – Lines 92-95: I don't understand what you are doing here.
5 – Lines 113-115: why is it divided by 14.375?
I was looking at your comment from 15-05-2015. It is interesting: the Kalman filter doesn't improve attitude determination with quaternions. May I disagree? The Kalman filter (if well tuned) would drastically improve the representation of the attitude over time, enabling what is called a "smooth" representation.
Here (http://www.himix.lt/?p=915) are you using just quaternions? No KF, right?
Have you ever tried to implement it on an Arduino Uno?
I heard rumors that it is impossible due to limited memory?
Thanks in advance for your kind answers.
I really appreciate your wonderful work! It works nicely.
P.S. Do you have an oscilloscope for dumb macOS users?
1. Concerning all the first questions: look up some theory on the internet about how it works and read the sensor datasheets; that will answer lots of your questions.
2. "The Kalman filter (if well tuned) would drastically improve the representation of the attitude over time, enabling what is called a 'smooth' representation." – show me some proof of that "drastic" improvement. Quaternions already have a smooth representation.
3. "Here (http://www.himix.lt/?p=915) are you using just quaternions? No KF, right?" – correct.
4. "Have you ever tried to implement it on an Arduino Uno?" – Yes, I have tried it.
5. "I heard rumors that it is impossible due to limited memory?" – Wrong, there's enough memory.
6. No, I don't have an oscilloscope for Mac.
I did everything exactly, but when I rotate the object it does not rotate on the Y axis; instead it makes a combined motion that continuously pushes it down, making it impossible to orient. Why? How can I fix it?
I know this might be a stupid question, but I noticed when opening your project that I downloaded from this site, Unity immediately opens the "Game" tab and the "Scene" tab is missing. I was wondering how you did that. Thank you so much for this tutorial; I really learned a lot from this experience.
Thanks, I've managed to find a solution on these forums! But I'm now facing another issue, as I have multiple targets. It works great until I click the camera button and track another target: the share button from the previous target still appears… Is there a way to restart/disable your script in OnTrackingLost?
I ended up duplicating your script and calling the matching Canvas for each ImageTarget. I don't know if it's the best thing to do, but it is working! Sorry for the bother, and thanks again for your tutorials!
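For anyone hitting the same issue, a minimal sketch of the per-target toggle described above, assuming one copy of the script per ImageTarget with its own Canvas dragged into the hypothetical canvas field; it mirrors the OnTrackingFound/OnTrackingLost pattern of Vuforia's DefaultTrackableEventHandler:

using UnityEngine;
using Vuforia;

// Hypothetical sketch: attach to an ImageTarget; its Canvas is shown only while
// this particular target is being tracked.
public class PerTargetCanvasToggle : MonoBehaviour, ITrackableEventHandler
{
    public GameObject canvas;                 // assumption: the UI Canvas belonging to this target
    private TrackableBehaviour mTrackable;

    void Start()
    {
        mTrackable = GetComponent<TrackableBehaviour>();
        if (mTrackable != null)
            mTrackable.RegisterTrackableEventHandler(this);
        if (canvas != null)
            canvas.SetActive(false);          // hidden until the target is found
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        if (canvas != null)
            canvas.SetActive(found);          // show on tracking found, hide on tracking lost
    }
}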
After tests on several devices, I am facing a few troubles on a tablet using Android 4.4.2:
– If I take a screenshot of an ImageTarget and share it right away, my app restarts.
– If I take a screenshot of ImageTarget A without sharing it afterwards and take another one of ImageTarget B right after, my app closes.
It is working great on smartphones using Android 4.2.2 and 5.1.1, though; any idea what the problem could be?
Hi,
I downloaded the script file and loaded it directly into my scene. It didn't work: both the buttons and the models appear as soon as I enter play mode.
Later I also tried changing the names of the buttons and models according to what I have named them in my scene. It now shows me a compiler error to fix.
Could you please explain which attributes need to be changed before loading the script?
Hi, your tutorials are very helpful, great job! I've got one question: is it possible to take a snapshot with the interface graphic elements? In my case the snapshot works, but without the augmented layer. I'm working on a simple app with OpenCV for Unity. Maybe I have to change the camera name in your script?
Thank you, and please keep your tutorials coming!
Hi,
I ran this code with the GY-85 BMP085 sensor. The ADXL345 library is shown in black instead of orange in the program. Also, in the serial monitor, the values do not change and are always "0.00, 0.00, 92.50". I don't understand why. I need help 🙁
I actually know what your problem is. In my code, ADXL345.h is also black, but it runs just fine. I get the same problem if I move the libraries to the wrong location.
To explain: when my code works correctly, I have my main folder labeled "Arduino". Within that, I have a unique folder for each ".ino" file, labeled the same way as the ".ino" file (minus the .ino), and a folder labeled "libraries". All of the libraries go in the "libraries" folder, each saved in another folder titled with the name of that library followed by "_library". For example, it goes:
Arduino > libraries > ADXL345_library > "all contents of that library"
I get the problem where my serial monitor values are always "0.00, 0.00, 92.48" if I move the libraries from the "ADXL345_library" folder to the "libraries" folder.
I don't know if that actually makes a difference, but it was the same problem for me, so hopefully this helps you fix it!
Your tutorials are great!
I learned so much from them. I watched almost all your AR tutorials and executed all the projects!
I had a lot of fun watching and learning.
Thanks a lot and keep up the good work.
Please post more tutorials 🙂
It's an awesome tutorial, but I was wondering if it is possible to add additional text to the user's own; for example, he/she wrote "I like these games", and at the end a fixed #CompanyName?
The video with a new marker cannot be played. The video with the marker provided can be played. Do I need to upgrade to Unity Pro to play the video? Thanks in advance. 🙂
With nyar4psg you will be able to track only square black markers; of course, you can make your own, but nothing like image targets. Such marker-based tracking won't be as robust.
I did everything step by step, but my videos won't play.
The video shows up, but it gives me the X image, and when I click on it, it shows the loading image forever.
How do I get my video to actually play?
Hey, I tried doing the same, but my Unity crashed when I was trying to add the ImageTarget. I am using Unity 5.3.3f Personal. Can you tell me the version of Unity you are using, so I can follow your videos?
Hi Admin,
I'm trying Processing on Ubuntu, but I don't know how to import the Nyar4PSG library. I tried to create the folder and copy it into ~/Documents/Processing/libraries, but it doesn't work correctly.
Thank you for the tutorial! I am working on an app that is going to use Text Reco and Cloud Reco. I have a couple of questions that I am hoping you could answer. For starters, when I run it in Unity, the area that can actually read the text is really small and not very forgiving whenever I move the text. I was wondering if you know of a way to make the region where it reads the text larger / more forgiving when the target or phone moves. Also, I was wondering if you know anything about cloud recognition. I tried using the Vuforia tutorials, but they are out of date and no longer work, and I can't seem to figure out the newest tutorial either. I'm assuming I'm messing up somewhere, because when I look online nobody else seems to struggle. Any input would help, especially with cloud recognition. Thanks!
1. The virtual buttons do not work unless I focus on the button.
2. Without touching a button, it changes the model based on my camera movements.
3. I changed the max simultaneous tracked images from 1 to 4 (a separate build for each on my Android mobile).
4. The virtual button sensitivity setting was also changed from HIGH to LOW (a separate build for each on my Android mobile).
If you want, I will also send a link to my Unity package file.
What steps need to be performed if we want video playback on a cylinder target? I want to see a video on a cylinder-like object instead of a flat image marker.
I have already managed to display a video on an Image Target.
In the Image Target case, we upload our marker image to the developer portal database; but for this case, assume that the image marker is a sticker attached to a bottle. I want to see the video as I scan the sticker.
So, shall I upload that image as a cylinder target image in the developer portal database?
And what would the hierarchy be inside the Unity project?
In the case of video playback on a target image:
– ImageTargetStones (parent) contains ImageTargetBehaviour.cs
– Video (child of ImageTargetStones) contains VideoPlaybackBehaviour.cs
What would the hierarchy be for displaying video on a cylinder?
Hi!
I did everything like you with Unity 32-bit, but when I click Start and show the target in front of my webcam, the 3D model doesn't appear in AR.
Hello,
thank you for this tutorial.
I tried to follow this video, but I have a problem.
Assets/script/SnapshotShare.cs(7,17): error CS0246: The type or namespace name `AndroidUltimatePluginController’ could not be found. Are you missing a using directive or an assembly reference?
This error appears; do you know why?
Hi, I just bought the plugin, but somehow I faced the same problem as sh. (I am new to this.)
The same problem here.
Hey.
Nice tutorials, they really helped.
One doubt though: what is the basic difference between markerless and marker-based AR? I tried searching for it, but I'm still confused. In this case, if we are adding the image beforehand, how is this markerless AR?
It would really help if you could clear up my doubt.
In marker-based tracking we track only black square markers. In markerless solutions we can track image targets, faces, hands, fingers, finger-like objects, bodies, etc.
Great… I need this tutorial, thanks a lot.
Can you give another tutorial on rotating the car with left and right buttons?
I already made the UDT, but I can't rotate the object.
Thanks :)))
Hi, I still don't know how to install the distributed library into the program using nyar4psg. I have googled a lot but couldn't find anything; could you show me how? Thanks.
Hi Kiran,
Did you find out how to play the video continuously when the target moves out of the camera's focus? Ideally, we would like to use Vuforia only to trigger the video player, so it comes out of the image, turning/moving towards the screen and finally settling into place. Once it's in place, we can touch the play button for the video to play in full-screen mode. It would also be nice to close the finished video and return to targeting mode to trigger another video from a different image. Any help would be greatly appreciated. TIA.
Only the video preview in full-screen mode would not depend on the tracking state.
About the other needs: there is no easy recipe for how to do this, you just need to code it, but I don't think you'll be able to have additional buttons (of your own) when the video is in full screen.
Thanks a lot for the tutorial. I'm having the same webcam problem, where it seems like you need the 32-bit version; in the latest Unity version there is no 32-bit build. What can I do?
Hi… wonderful tutorial. However, I followed all of the steps you explained, but the camera cannot detect the 3D object, even on the textured sheet. Can you please help me with that?
Hey, I have the same issue. I followed the guide and deployed the app on a Nexus 7, but the object cannot be detected. I did not make any modifications to the code, so I don't get why it is not working.
Hi, I followed your tutorial, but I couldn't find AppManger.cs and SceneViewManager.cs in the Unity /Assets folder.
Could you tell me how and where I can find them?
I hope this project will be possible on a smartphone and that you create all the Yu-Gi-Oh! monsters, because the application (androdisc) has only 60 monsters. I would also like to know how I can build this demo. Thank you.
So basically you mean I just have to put my video in the appcontent folder instead of your augmented_reality_technology?
By doing this, can my video play instead of the video provided by you?
I tried it, but when I play it on mobile, the moment I tap the screen to play, the screen goes black… Is there a specific mistake I am making? Can you please tell me?
I have tried your tutorial, but when I move the object away from the camera, the interface stays on the screen, but at an angle. How do you fix this? Is it something to do with the script? Please let me know, thanks.
Same here! I would love to know how to fix it, so it only pops up when you point towards the target. I already tried putting the Canvas inside the Image Target, and it does not work. Thanks for the tutorials!
Thank you for your tutorial, it's very helpful.
I followed your tutorial, but I have an error like this:
Error attempting to SetHeadsetPresent
UnityEngine.Debug:LogError(Object)
Vuforia.VuforiaAbstractBehaviour:SetHeadsetPresent(String)
Vuforia.VuforiaAbstractBehaviour:Start()
Hello, I downloaded the source code and tried running it on my Android device. However, only the 2D ground image is displayed when targeting the image target. Any clue why that might be happening? Please help as soon as possible. Thanks.
Hi Edgaras Art,
First of all, thanks for the tutorial series. I have a question: in this case, I think that although we are moving the tracking image with our hands, it remains stationary in the Unity scene, and it seems that it is the AR Camera that moves.
What I want is that, as I move the tracking image with my hands, the 3D object placed on the image moves along with it in 3D space.
Hey man, I've tried this tutorial and it works, but now I've got a problem: the warning says "trackable userdefine lost" and the object doesn't show up when I click the button.
Can you tell me how to fix this?
Hello Edgaras,
thank you very much for your tutorials.
I tried this with my own video and it works perfectly.
I also changed the orientation of the video by selecting VIDEO in the Hierarchy and changing the X scale value from 0.1 to -0.1.
I have a problem when I pause the video and play it again: the music starts from the beginning, but the video stays frozen.
Where is the problem? Maybe because I stream an MP4 video instead of M4V?
Hello,
I am a fan of your page.
In Tutorial No. 39 you put some jpg images as examples.
How can I place OTHER jpg images in Unity?
I tried to add some, but they were not accepted.
When I import the videoplayback package, I get these errors:
Assets/Common/MenuOptions.cs(10,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
Assets/Common/SplashAbout/AsyncSceneLoader.cs(7,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
Assets/Common/SplashAbout/LoadingScreen.cs(10,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
I'm building the Unity 3D project for my Samsung, and the panel and buttons are not appearing. They appear when I test it in Unity 3D, but not on my phone.
Do you have any idea what it could be?
You're great; your content is worthy of a master's-level class. This game is really good and has a lot of potential in many ways, but I think you're missing some digital marketing. If you need help with that, I know a little, hehe. I hope you keep making this kind of content, and I hope your projects are a success.
If the 3D model is rigged, then you can do that even with a static model (without animation); the head moves together with the head bone. For rotating the whole model (the parent GameObject) left/right, just use one of the rotation methods (RotateAround, eulerAngles); see the sketch below. 🙂
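A minimal sketch of that kind of left/right rotation, assuming the script sits on the parent GameObject and is driven by the horizontal input axis (arrow keys, or on-screen buttons feeding the same value); the speed is an arbitrary assumption:

using UnityEngine;

// Hypothetical sketch: rotate the whole model (parent GameObject) left/right
// around its own Y axis using Transform.RotateAround.
public class ModelYawRotator : MonoBehaviour
{
    public float degreesPerSecond = 45f;  // assumption: rotation speed

    void Update()
    {
        // Left/right arrow keys (or A/D) give a value between -1 and 1.
        float input = Input.GetAxis("Horizontal");
        transform.RotateAround(transform.position, Vector3.up,
                               input * degreesPerSecond * Time.deltaTime);
    }
}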
Please, I'm a student starting my graduation project and I need help. I want to know how to start an augmented reality app for Android devices; I want to use Android Studio, step by step. My idea involves face tracking too.
Can you please help me?
I have the same problem as:
"I did everything step by step but my videos won't play. It shows up but it gives me the X image and when I click on it it gives me the loading image forever." Also, I can't find the file "AppManger.cs". Any ideas? I use the latest Unity and Vuforia plugins.
I noticed there were questions about the videos being inverted during tests. Mine is doing the same. I have tried everything suggested. Can anyone help regarding where the proper axis change is made?
The current setting for the ImageTarget is: X -0.1, Y 0.1, Z 0.1.
Hi, thanks for your tutorials. These tutorials are a great help for beginners.
I'm facing a small problem; please guide me through it. When I press the arrow keys, the player animates perfectly and rotates too, but doesn't move physically on the plane; it animates only at a fixed point. Thanks in advance 🙂
Your website is awesome. I discovered it several months ago, but always thought that this requirement of having a target image was somewhat cumbersome. Thank you very much, sir!
I've tried to make video playback like this, but Unity says that "ISampleAppUIEventHandler" cannot be found. It's because I don't have that file in my project, so where can I get that file?
First, congratulations on your tutorials, 1000 points. I've been doing this one now; in my Unity, both 32- and 64-bit builds generate an error when starting the camera. I have already put the API key in the editor, and it still generates an error with the name.
I followed your tutorial; it's excellent. But I am not able to control the animation. In the Game view it's very large. Can you please explain how to control the animation?
Thanks a lot for such useful and detailed instructions! I'm just starting to explore how to create AR with Vuforia and Unity, and these tutorials definitely come in handy 🙂
I tried to follow this tutorial, but unfortunately there is no such property for a button (like in your video at 7:11). Here's a screenshot of what I see: http://prntscr.com/c5cqi4 . There is no init() function. I tried to use start() instead, but it didn't generate that second script where you change some code (from a private function to public).
I'm using Unity 5.4.0 and Vuforia 6 (tested on v5 as well).
Can you please explain what I'm doing wrong and how to fix it? Thank you so much in advance! I hope you'll find time to answer.
Keep doing awesome things! 😉
Failed to load ‘Assets/KudanAR/Plugins/x86_64/KudanPlugin.dll’ with error ‘操作成功完成。
‘, GetDllDirectory returned ”. If GetDllDirectory returned non empty path, check that you’re using SetDirectoryDll correctly.
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:203)
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:196)
Kudan.AR.KudanTracker:Start() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:220)
Hi, may I know how the characters look at each other? Are you using LookAt in Unity, or something else? Because I want my characters to look at each other, but I still haven't found out how.
Hi There, I am playing around with this and am wanting to have 5 pages instead of 3. For some reason when I add two more pages, the swipeimage script seems to malfunction, not allowing me to swipe at all. Any thoughts? I adjusted all the parameters I could think of to account for the new pages but I didn’t mess with the script at all. Would it need modification? It didn’t seem like it should…
Hey, thanks for the tutorial, but the share button does nothing while all the other buttons work. I bought the plugin and followed the tutorial; is there a permission I should be adding, or has something changed?
I have followed the instructions above. However, the plane does not automatically disappear unless I click it. After I click to make the plane disappear, the cube or sphere does not appear. Please give me some advice.
I am using Unity 5.2 and Vuforia SDK 5.5.9.
Hi Edgaras, I'm very interested in your AR technique shown in the video; if possible, could you make a tutorial or share some information that I can look up about this?
Also, regarding Unity, could you tell me how you convert a 2D coloring texture so it maps onto the 3D model? Please!
Thank you!
When I scan a plane using the camera and the loading bar, the model gets loaded on another plane. Is there anything I might have missed or messed up? I followed your tutorial carefully!
I follow your tutorials and they are great.
I have a problem.
I am using Unity and Vuforia (user-defined target).
I am recognizing objects as targets (I followed this tutorial), but my virtual 3D object and canvas are unstable, and extended tracking doesn't work like it does with image targets.
Do you have some experience with this? Has this happened to you at some point?
I will explain: I have a sculpture to recognize, and I tried the AR-Media object-scanning solution, but the app becomes slow and also unstable; that is why I am using user-defined targets to overcome my problems with object recognition.
#region PUBLIC_MEMBER_VARIABLES
public string TitleForAboutPage = "About";
public ISampleAppUIEventHandler m_UIEventHandler; // error: The type or namespace name 'ISampleAppUIEventHandler' could not be found
#endregion PUBLIC_MEMBER_VARIABLES
#region PROTECTED_MEMBER_VARIABLES
public static ViewType mActiveViewType;
public enum ViewType { SPLASHVIEW, ABOUTVIEW, UIVIEW, ARCAMERAVIEW };
I am running NyAR4psg/3.0.5; NyARToolkit/5.0.9 in Processing 2.2.1 with a Microsoft LifeCam HD-5000 on Windows 7. When I run simpleLite, the background (camera) image appears only in the upper right corner of the window, and it shows the lower left of the camera view. If the background image were correct, the tracking would appear to be correct. I looked in the reference material and found public void drawBackground(processing.core.PImage i_img).
This function draws the PImage to the background. The PImage is drawn on part of the far-clip surface +1.
This function is equivalent to the following code.
:
PMatrix3D om = new PMatrix3D(((PGraphicsOpenGL)g).projection);
setBackgroundOrtho(img.width, img.height);
pushMatrix();
resetMatrix();
translate(0, 0, -(far * 0.99f));
image(img, -width/2, -height/2);
popMatrix();
setPerspective(om);
:
My approach was to substitute this code in for the line "nya.drawBackground(cam);" and then adjust the translate call to correct the issue. But I get a "syntax error, maybe a missing semicolon?" error. I added a semicolon to the end of the second line, setBackgroundOrtho(img.width, img.height);, and it still stops on the first line with the same error.
Any help would be appreciated.
Hi,
Please help me.
I downloaded the Augmented Reality Vespa User Interface – Mimic No. 1. Really, this is only the interface, so I can't test the project.
Where can I download the motor image?
Hi, I need to know if I have to buy a 3D sensor camera to build a game with Smart Terrain, or can I use the regular camera of my smartphone? Thank you.
Thank you for providing this nice platform. We are looking for a really good developer who can develop this paint functionality for us; we are already working on our product and need to integrate that part into it (we are using Unity3D, Vuforia, C#).
The basic requirement: the app should recognize/read the colors from the marker and apply them to the model itself.
Looking forward to hearing from you soon.
Regards
ABID
P.S. I'll be submitting a few cool AR demos to this site very soon 🙂
I'm having an issue with the screenshot aspect ratio. When I take a screenshot (in landscape or portrait mode), the image comes out vertically stretched (or horizontally squished). I tested it on 3 Android devices, and it's the same on all 3. The images come out normal when I take a screenshot in Unity on my Mac.
After a lot of research, I still can’t figure out the cause.
Do you have any suggestions?
Thank you so much for the tutorial! Really appreciated it. 🙂
But anyway, do you have any idea how to reset the distance value once it is in "OnTrackingLost"?
Because every time, I need to separate it first (while scanning the object), and then the particle effects are destroyed.
Otherwise, they will still remain on top of the Image Target when I scan for the second round, even though I didn't connect the papers.
I would greatly appreciate it if anyone could help with this problem. Thank you! :)
When I scan only one part of the image alone for the second round, the particles still stick to the image even though I didn't pair it up with another image target.
I've tried a few solutions, but it seems too many errors come out.
For one of them, I tried putting parts of this inside the OnTrackingLost() section:
"
string NameTarget = "imageTarget_" + mTrackableBehaviour.TrackableName;
GameObject target = GameObject.Find(NameTarget);
transform.position = new Vector3(0, 0, 0);
"
in order to reposition the sphere back to its normal position when tracking is lost, but it seems like it's not working because I'm not very good at C# coding.
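Not an official fix, just a hedged sketch of one way the reset might be wired up, assuming the particle object is a child of its ImageTarget; the class name, the use of localPosition and the ParticleSystem clearing are all assumptions:

using UnityEngine;

// Hypothetical sketch: call ResetToTarget() from OnTrackingLost() (or attach this to
// the particle object and invoke it when tracking is lost) to undo the "pairing".
public class ParticleResetOnLost : MonoBehaviour
{
    public void ResetToTarget()
    {
        // Assumption: this object is a child of its ImageTarget, so snapping the
        // local position back to zero returns it to the target's origin.
        transform.localPosition = Vector3.zero;

        // Assumption: clear lingering particles so they don't stay stuck to the image.
        ParticleSystem ps = GetComponentInChildren<ParticleSystem>();
        if (ps != null)
            ps.Clear();
    }
}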
Hey guys, I have used your tutorial to make the simple video playback app, and it's working great. I just want to know how we can change the size of the video that appears after tracking the image target.
Please, I need help.
Hello, I have followed the whole tutorial properly, but when I connect my laptop to the Kinect, the picture won't open. I don't know what happened; do you know how to solve this problem?
CS0246 C# The type or namespace name “AndroidUltimatePluginController” could not be found (are you missing a using directive or an assembly reference?)
I tried to run your project in Unity 3D and it's amazing! Thank you. But when I built it into an application, it can't detect my webcam. Do you know why? Please give me an answer, thank you.
context.enableUser(); — when playing this sketch, this error shows at this line: "The function context.enableUser() expects parameters like context.enableUser(int);"
Please help me resolve this.
I have never succeeded in creating AR files using Vuforia and Unity. I use a desktop PC that doesn't have any camera; can I do it with this specification: desktop PC, Windows 10, 16 GB RAM?
Assets/VirtualButtonEventHandler.cs(5,14): error CS0101: The namespace `global::' already contains a definition for `VirtualButtonEventHandler'. What about this error?
Hi sir,
your tutorials are great, thanks for uploading them.
Can we integrate 2 or more videos with a single Image Target and make next and previous buttons to change between the videos?
Is it possible?
Thanks in advance.
Hi there,
Thank you for the video.
I am new to Unity and all this stuff.
I followed each step,
but I got an error after I removed the Utility folder, at minute 10:34.
This is the error:
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(19,13): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the legacy particle system, which is deprecated and will be removed in a future release. Use the ParticleSystem component instead.’
Assets/Vuforia/Scripts/Utilities/VRIntegrationHelper.cs(99,29): warning CS0618: `UnityEngine.Camera.SetStereoProjectionMatrices(UnityEngine.Matrix4x4, UnityEngine.Matrix4x4)’ is obsolete: `SetStereoProjectionMatrices is deprecated. Use SetStereoProjectionMatrix(StereoscopicEye eye) instead.’
Assets/Vuforia/Scripts/Utilities/VRIntegrationHelper.cs(100,30): warning CS0618: `UnityEngine.Camera.SetStereoProjectionMatrices(UnityEngine.Matrix4x4, UnityEngine.Matrix4x4)’ is obsolete: `SetStereoProjectionMatrices is deprecated. Use SetStereoProjectionMatrix(StereoscopicEye eye) instead.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(75,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(79,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(87,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(91,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(49,32): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the legacy particle system, which is deprecated and will be removed in a future release. Use the ParticleSystem component instead.’
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(52,125): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the legacy particle system, which is deprecated and will be removed in a future release. Use the ParticleSystem component instead.’
Assets/ZigFu/Scripts/UserEngagers/ZigEngageSingleSession.cs(6,23): warning CS0649: Field `ZigEngageSingleSession.EngagedUser’ is never assigned to, and will always have its default value `null’
error CS1705: Assembly `ZDK, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null’ depends on `OpenNI.Net, Version=1.5.2.7, Culture=neutral, PublicKeyToken=6b43d0c6cf74ee7f’ which has a higher version number than referenced assembly `OpenNI.Net, Version=1.4.0.2, Culture=neutral, PublicKeyToken=6b43d0c6cf74ee7f’
C:\Users\Esra\Documents\Zigfu\Assets/ZigFu/Scripts/_Internal/ZDK.dll (Location of the symbol related to previous error)
When I tried to run the demo I got a "Vuforia initialization failed" error, but when I opened the file in which the error was reported, Visual Studio does not report an error.
I am using Vuforia 6.2.10 in Unity 5.5 (this may be the problem, but there are no error reports to indicate that something is wrong). What could be the problem?
Well, now I have just one simple problem – the Kinect is not registering my face and is constantly running the default animations – could this be a problem with the lighting on my face, or a problem with the scripts (the Kinect not getting the required info)?
PS: Thanks for the quick reply 😀
[…] OK, so back to what we do next with is “interface with our characters”. That’s right, I brought you to life, now speak to me! Love me! Give me attention, Pooh Bear, I MADE YOU! The user interface of augmented reality, especially for eyewear, is being developed now, and is ot yet ready, so, sorry but Pooh is still too dumb to play. The technicians and UI developers are working closer than ever as we enter an age of seamless integration between the game engines like Unity 3D or Unreal and SDKs like Vuforia. The SDKs help you get started from developer portal to final product. Check out tutorial here. Additionally, you can download Augmented Reality User Interfaces here. […]
When I select the Business Card as my image target, it does not display in Unity as that image. I have also not been successful adding my own custom images.
hello
can you clone this app !
i mean can you create a source code similar to this game
i will be the first buyer :p
i think it's easy for you because you already made an 'Augmented Reality Coloring App'
but can you make a complete project for a coloring book with 6 pages, for example?
Hi, I downloaded the whole project to try this, but it has an error in the script PolygonField.cs, line 96: lineRenderer.positionCount = go_points.Length; It says "Type `UnityEngine.LineRenderer' does not contain a definition for `positionCount' and no extension method `positionCount' of type `UnityEngine.LineRenderer' could be found. Are you missing an assembly reference?" Could you please help me solve this problem? I am using Unity 5.5.
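For reference, positionCount was only added to LineRenderer after Unity 5.5, which is most likely why a project written against a newer editor fails to compile there. A hedged workaround (not verified against this exact project) is to fall back to the older vertex-count call on pre-5.6 editors:

    // Hypothetical compatibility shim around the failing line in PolygonField.cs.
    #if UNITY_5_6_OR_NEWER
        lineRenderer.positionCount = go_points.Length;
    #else
        lineRenderer.SetVertexCount(go_points.Length);   // older, since-deprecated equivalent
    #endif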
When trying to set the image target to the pre-decided target, the image only comes up as white. I've tried various photos and digital drawings as well as your target. Do you know what the problem might be?
Hello, can you say whether it is possible to change the 3D model in the app? For instance, I press a button "change model", then I choose an object, and this object appears in the app.
Great tutorial! I just wanted to know whether there is any target limit in this case. I am planning to have about 50 targets in my project – is that possible? Thank you very much.
Hey, I want to ask something: when I change the license key of the Wikitude camera to my license, the camera still does not work on my webcam, but when I switch to Vuforia it can open the camera on the webcam. It seems to be an error in the Wikitude camera maybe – am I right?
Hi – Just wondering if this works in 5.6.3?
Your project file works in 5.6.3, however, when I build my project based on your parameters, I can’t seem to make it work at all:)
Is there an update?
How can I implement virtual buttons like Air Tap rather than static buttons? The number of buttons changes at runtime, and when a user points their finger above a button I want to call that button's function. Can you help? Thanks.
Hello,
Thank you for your interest. It works on iOS platform. Currently code is adapted to this specific Vespa model, so most likely you will have to make minor code modifications. Documentation is provided within the project after you buy.
Regards,
Edgaras Art
Hello, I have just sent a comment but I can't see it here?
I am following your blog, it is fantastic.
I am making an Augmented Reality app and I would like some help with my code, please:
how to make a VirtualButton that, when pressed, plays a specific animation on the character.
I want to make 3 buttons, each with a different animation.
Thanks in advance.
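For illustration, a minimal sketch of one way to wire this up (my example, not the tutorial's code). The button names ("btn_wave", "btn_jump", "btn_dance") and the Animator trigger names are placeholders, and the exact Vuforia class name (VirtualButtonBehaviour vs. VirtualButtonAbstractBehaviour) depends on the SDK version you use:

    using UnityEngine;
    using Vuforia;

    public class AnimationVirtualButtons : MonoBehaviour, IVirtualButtonEventHandler
    {
        public Animator characterAnimator;   // drag the character's Animator here

        void Start()
        {
            // Register this script with every virtual button under the ImageTarget.
            foreach (var vb in GetComponentsInChildren<VirtualButtonBehaviour>())
                vb.RegisterEventHandler(this);
        }

        public void OnButtonPressed(VirtualButtonBehaviour vb)
        {
            switch (vb.VirtualButtonName)
            {
                case "btn_wave":  characterAnimator.SetTrigger("Wave");  break;   // placeholder triggers
                case "btn_jump":  characterAnimator.SetTrigger("Jump");  break;
                case "btn_dance": characterAnimator.SetTrigger("Dance"); break;
            }
        }

        public void OnButtonReleased(VirtualButtonBehaviour vb) { }
    }

Attach the script to the ImageTarget, add three virtual buttons as its children with the matching names, and define the three triggers in the character's Animator Controller.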
When I first created the tag “Model” and ran unity, everything worked! However, I then added another tag and called ‘Model1’ and deleted my tag ‘Model’. I then realized that the tag ‘Model’ was part of the code. I added back tag ‘Model’ and assigned it back to my model. However, the buttons don’t work anymore. I only have 1 tag in my tags, and that is ‘Model. Please advise? Thanks!
Hi, can you put up a tutorial on object tracking – how to track an object and, using the tracking, change the model color and similar things? Please go through the
following link: https://www.youtube.com/watch?v=c_jFgEiFotE. Please help me, sir; a tutorial on this would be very helpful for me.
Hi there,
First of all I want to thank you for publishing great tutorials. My question is: how can I establish the data transfer between the Arduino and an Android device instead of the computer, so that the Android device can interpret the data coming from the Arduino (via Bluetooth or WiFi) and manipulate the AR model? Maybe it would be too complicated to explain here, but do you know any tutorials or sources I could use? Thanks a lot.
Nice augmentation. I also saw your model-based tracking for car wheels, which is based on the VisionLib SDK. Do you think combining Vuforia VuMark tracking and Vuforia Model tracking in one Unity scene is possible?
There was a discussion back in 2013 on the Vuforia Developer Portal regarding User Defined Targets and Image Targets in the same Unity scene, which was not possible to run. I tried but also cannot get it to work.
If, for example, a model or image target is not recognized (due to bad environment conditions like bright or low light), the user should get the possibility to enable the augmentation anyway. Hence he should be able to start User Defined Targets to track the scene and start the augmentation.
That's not entirely correct regarding how User Defined Targets work. When you talk about User Defined Targets you probably think more of a SLAM technique (which is closer to what ARKit and ARCore offer), but that is not the case. With UDT you just take a snapshot in real time and augment the content without a predefined Vuforia image database.
In Vuforia you can enable or disable SLAM for any kind of target, independent of the type. So there exist Model Targets with or without SLAM, UDT with or without SLAM, Image Targets with or without…
The question is whether it is possible to have more than one target type in one scene. E.g., if the Model Target fails to be detected, the user could press a button that grabs a snapshot and augments the content onto the current scene. For example, for your car wheel demo: if it fails, the user could align the wheel, press the UDT button, a snapshot of the car with the wheel is taken and the content is shown attached to that snapshot. Of course it will not work for the next car, but at least for this one.
Extended tracking is not SLAM. The question is whether it is possible to have more than one target type in one scene. E.g., if the Model Target fails to be detected, the user could press a button that grabs a snapshot and augments the content onto the current scene. – This would depend on the environment itself; it would not work 100% in all cases.
The question is not whether it might detect the snapshot – that may depend on the environment – the question is whether it is possible with the Vuforia SDK. As I already mentioned, in former versions of the Vuforia SDK either nothing worked at all or the application just crashed. But you already gave your answer with the first reply: "I haven't tried combining". Maybe this would be a challenge for you to get it to work…
Alternatively, Model Targets could be used, but again I need UDT as a fallback.
Yes, I want to scale the object with my finger touch. Plus, I am also facing an issue regarding the position of my object: sometimes it floats in the air instead of appearing on a flat surface.
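For the touch-scaling part, here is a rough, self-contained sketch (my own, not from the project) of pinch-to-scale using Unity's touch input; the min/max limits are arbitrary example values. The "floating in the air" issue is separate and usually depends on where the ground-plane hit places the anchor.

    using UnityEngine;

    public class PinchScale : MonoBehaviour
    {
        public float minScale = 0.05f;   // example limits
        public float maxScale = 5f;

        void Update()
        {
            if (Input.touchCount != 2) return;

            Touch t0 = Input.GetTouch(0);
            Touch t1 = Input.GetTouch(1);

            // Finger distance now vs. one frame ago.
            float prevDist = ((t0.position - t0.deltaPosition) - (t1.position - t1.deltaPosition)).magnitude;
            float currDist = (t0.position - t1.position).magnitude;
            if (prevDist <= Mathf.Epsilon) return;

            float factor = currDist / prevDist;   // >1 spread, <1 pinch
            float s = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
            transform.localScale = new Vector3(s, s, s);   // keep the scale uniform
        }
    }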
Lots of things have changed since then. If you use 2017.3 with integrated Vuforia and still import Vuforia 6.2, some Vuforia files get duplicated and you get errors. I suggest you recreate the project freshly based on the video, but using the newest Unity version – don't import Vuforia 6.2 anymore.
Hi. Is there any tutorial trying to implement a way to produce a sound, such as an AR music instrument, when the user puts his finger on the virtual button? Do you think it is possible to do this? Thank you so much.
Hello, take a look at Unity3D + the Vuforia plugin. Once you have these tools installed, you can have something running in about 20 minutes (from the Unity3D editor) that will work the same way on Android/iOS platforms once the solution is exported. Good luck!
thank you very much for your tutorial!
I am a newbie concerning augmented reality, but may I ask you if it is possible to create a sphere instead a can!?
My idea is to show the earth (globe) with a geostationary satellite (augmented reality)..
So, if the globe is rotating, the satellite should rotate too.
But I am not sure how to start because the satellite has to be always over the same spot (for example Europe).
I think I have to make a connection between the 3d model (satellite) and the texture of my globe (Europe), but I don’t know how to do that…
I believe that the introduction of augmented reality in publishing and education is really a breakthrough. This stimulates and attracts the attention of students in the learning process. I think you will be interested to read about the role of augmented reality in publishing ( https://invisible.toys/augmented-reality-for-publishers/ )
Just checked on Windows Unity3D 2018.1.0 – works fine. Sure, it can be used for commercial purposes, just keep in mind, that you need to buy license, if you want to eliminate XZIMG watermark.
Hi, thanks for this work you show us. I have a question: I have created two scenes with models and the buttons to scale, rotate and change scenes. In the first scene I have no problem, the buttons work properly, but when I change to the next scene the buttons don't do anything – the rotate buttons work but keep spinning the model around, and only the previous-scene button works. As in the first scene, I tagged my model as "Model", so I don't know what the problem is. Please help me with this. Thanks in advance.
hey this is not from this video but i had to ask… how can i implement multiple virtual buttons in scene . basically i want to see the c# code for the same . thanks
Thanks a lot for the answer.
So if I pay, can I download the entire project and use it as a base for another project?
Also, do you include a guide or something to understand how it was built?
After I pay, could you resolve doubts and give some support if I need it?
Thanks a lot, and sorry for so many questions, but I'm very interested and I need to be 100% sure.
Hello. "So if I pay, can I download the entire project and use it as a base for another project?"
Exactly – I made this project for exactly that purpose. "Also, do you include a guide or something to understand how it was built?"
Documentation is included within the project explaining the scripts that were used, how to build AssetBundles, and where to change the Vuforia client and server keys so you can work with your own Vuforia Cloud Database (account). "After I pay, could you resolve doubts and give some support if I need it?"
Of course, I'll answer any question you have.
Hello,
Thanks! I’ve been testing EasyAR a while ago I haven’t used Cloud recognition on it, but I don’t see a reason why it couldn’t be adapted accordingly.
Not the latest, but I’ve remade it from Unity3D 2017.3.1 to Unity3D 2018.2.4 version (on this latest version selection from gallery and screenshot functionalities are disabled at the moment – other than that everything else works smoothly)
Hi, I’m interested in what you have developed here, facial tracking. I would like to buy this project in unity to adapt it to what I really need. I do not see the price of that here, do you sell it?
Hello. I want to ask whether it is possible to create this application using the free version, for learning purposes? I tried to build it for Android but it doesn't work, it gives an error. I look forward to hearing from you. Thank you very much!
Hi. What you see here is the free version. It's hard to tell the root cause of what is not working on your side, but I can assure you that this project works great and you will receive support from me if you buy this AR project.
Can you create a short tutorial on how to plug the free version AF-1.5.2-Trial into Unity, for Android only? I really want to learn how to create it.
Hello,
I work for the Brandenburgische Technische Universität Cottbus-Senftenberg, based in Cottbus and at the moment in COCKPIT 4.0, a research project for innovative automation solutions for small series assembly in the aviation industry.
Searching for a solution I found your tutorials and I would like to discuss further with you.
Kindly contact me.
Best Konstantinos.
Hi. I am not getting any error, but when I press play, the canvas image appears on screen; when I scan the marker it disappears, and when I remove the marker the canvas image appears again. How can I solve this? Please advise.
And thanks for the awesome tutorials – I have been following your channel for the last one and a half years. Thanks.
Hi,
Thanks for the tutorial.
I am trying to do something similar using ground plane detection and only placing the dominos one at a time (not using the Add Multiple script). My domino base has disappeared and the objects fall from above rather than being able to place them where the base should appear.
Do you know why this might happen?
Thanks again.
Hi! Thanks for replying.
Falls from above to the ground plane. I also changed the scale to 0.2 because they were huge otherwise – I think that might be the problem?
Hi, I'm trying to download the content you put into Unity but can't seem to, for some reason. Is there anything in particular I need to do to use the files?
I created the user interface according to your tutorial. All buttons except the scaling buttons work fine. With the scale-down button, if we keep pressing, at some point the model gets tilted and zoomed. How can we make it scale down to a minimum level without tilting?
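A small sketch of one way to avoid the "keeps shrinking and skewing" effect: scale uniformly and clamp to a floor value. This is my example, not the tutorial's script; "model", the 0.9 step and the 0.1 floor are placeholders for whatever your buttons already manipulate.

    using UnityEngine;

    public class ScaleButtons : MonoBehaviour
    {
        public Transform model;          // the augmented model the UI buttons act on
        public float step = 0.9f;        // shrink factor per press (example value)
        public float minScale = 0.1f;    // arbitrary example floor

        public void ScaleDown()
        {
            float s = Mathf.Max(model.localScale.x * step, minScale);
            model.localScale = new Vector3(s, s, s);   // uniform scale, so the model cannot skew or tilt
        }
    }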
Hello,
do you use cloud image targets, or are the image targets already in the Unity project?
I ask because I would like to add target images to my application using only the Vuforia Web Services, without updating my application.
Hey, nice project, but when I click on the link to download, it starts downloading another project, specifically WikitudeMultipleTracking. Please fix this so others can have access to this project.
Thank You!
[…] This is Augmented Reality channel in which easy-to-follow video tutorials are provided. In some cases interaction devices and various Arduino-based sensors are used to make it possible to manipulate the virtual content in Augmented Reality. We share the knowledge. And you? Hit like button and share with everyone! More info on Augmented Reality technology: http://www.ourtechart.com/augmented-reality/demo-augmented-reality-technology/ […]
As mentioned in the description, such an AR solution is a perfect fit for events where lots of people gather – it would attract a lot of attention and increase your brand awareness. It depends what message you have; people could take snapshots and share them on social networks with your watermarked brand. Of course, another option is to use it as a Virtual Dressing Room, which means clothing shops could offer the possibility to try on clothes before clients buy. Such an innovative approach to the client would boost a clothing company's sales.
I want to create an AR card application that I update automatically each time I add a new client card.
How can I get there, and how many different business cards can I add each month?
Is it possible to do with a cloud solution or Firebase?
This Paparmali 3 project (app) would allow you to do that. Overall, you can keep 100,000 (100k) image trackers in the Vuforia Cloud Database. You don't need Firebase for that, unless you have some other specific needs in mind.
So far, current clients haven't had any issues processing payments with the PayPal method provided on this website. At the moment only the PayPal method is available. For more details, let's discuss over email.
Hello. I am finishing my thesis project, and for it I need to learn how this app was made. I want to know how much you would charge to teach me how to do this. I am not a programmer, I do 3D modeling, and I want to learn this from scratch. I look forward to your response.
If I want to add a new target and content,
do I not need to use Unity 3D to add and update the content?
Do I need to host or buy a domain to upload my images?
You can upload image targets and assign URLs to content (a URL to a model (AssetBundle), a URL to an image, a URL to a video, etc.) from the iOS and Android apps and from Unity3D as well. But keep in mind that you would need to export and publish these apps on your own Google Play/App Store accounts. You need a place (a server) where you keep all your content, unless everything is already online (a YouTube video, an image from a website, etc.).
Image targets with metadata are uploaded to the Vuforia Cloud Database – documentation on how to set this up is provided within the project. In addition to that, I upload the target image to my own server, as Vuforia doesn't provide direct access to the uploaded target image itself.
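To make the idea concrete, here is a hedged sketch (not the project's actual code) of the pattern described above: the cloud target's metadata holds a URL, and the app downloads an AssetBundle from that URL and spawns its content under the recognized target. The calls below are standard UnityWebRequest APIs; how you obtain "metadataUrl" from Vuforia's TargetFinder search result depends on the SDK version, and the "one model per bundle" assumption is mine.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    public class CloudContentLoader : MonoBehaviour
    {
        public IEnumerator LoadFromMetadata(string metadataUrl, Transform parent)
        {
            UnityWebRequest req = UnityWebRequestAssetBundle.GetAssetBundle(metadataUrl);
            yield return req.SendWebRequest();

            if (req.isNetworkError || req.isHttpError)
            {
                Debug.LogError("Download failed: " + req.error);
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(req);
            GameObject prefab = bundle.LoadAllAssets<GameObject>()[0];   // assumes one model per bundle
            Instantiate(prefab, parent, false);                          // attach under the cloud target
        }
    }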
Hello. Thank you for your interest.
I have just re-tested everything just in case: re-downloaded, un-archived and started the solution successfully. Are you sure you have enough space on your HDD/SSD?
Hi, I'm a photo booth manufacturer in my country, Brazil (South America). I am interested in becoming a distributor of yours in my country; I manufacture equipment that could run your software.
Hello, I sell only software, not the hardware. The application runs on a PC + Kinect 2, you can download and test it. Link is provided in the description. For more details please contact me through contact form.
Hello. Yes, Unity and Vuforia. Actually, the ground contains a masking plane and the house parts are below it. On touch, the house parts go up (above the masking plane).
Hello, yes it does, the scene and documentation is provided within the project. Also, If you purchase it, I will provide an update of this project for a recent Unity3D 2019.1.x (or above) version within a week.
I would like to thank this service for a great product!
We bought the source code and are very satisfied with it – it has all the functionality we need.
We had watched a bunch of tutorials on YouTube, struggled with the standard packages, combined things and spent a lot of time.
Incidentally, we found the link to this service on YouTube.
And here everything is ready, with all the necessary functionality – even more than we need – and we have already adapted it all for ourselves.
The Play Market is already reviewing our release.
Support is at a high level: if something is unclear, you will be told how and where.
We bought the Paparmali 2 package and, while adapting it for ourselves, we were sent the Paparmali 3 update at no extra charge.
If you are gnawed by doubt about whether it is worth it – believe me, it is worth all the money spent on this source code, because it saves a lot of time.
As for your question: firstly, you need to track the body or, to be more specific, the body joints. By "tracking" we mean having 3-axis position and 3-axis orientation information for each body joint. Having this information, you can then "put" virtual content on top of the human body.
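A generic sketch of that idea: once some body-tracking source gives you a joint's position and rotation each frame, putting content "on" the body is just copying that pose onto a Transform. GetSpineJointPose() below is a hypothetical stand-in for whatever your Kinect (or other) SDK actually provides, and the offset is an example value.

    using UnityEngine;

    public class AttachToJoint : MonoBehaviour
    {
        public Transform content;                            // e.g. a shirt mesh
        public Vector3 localOffset = new Vector3(0, 0.1f, 0); // example offset from the joint

        void Update()
        {
            Pose joint = GetSpineJointPose();                // hypothetical tracking call
            content.SetPositionAndRotation(joint.position + joint.rotation * localOffset,
                                           joint.rotation);
        }

        private Pose GetSpineJointPose()
        {
            // Placeholder: replace with real joint data from your body-tracking SDK.
            return new Pose(Vector3.zero, Quaternion.identity);
        }
    }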
What I want is to add individual items like one shirt, one dress or one pair of pants – is that possible? And when you deliver the project, do you explain what must be modified in the mesh to make it work? Does the outfit adjust to the size of the person? Does it also work in profile, or only from the front?
Yes, it is possible to add a shirt, dress or pants, but it is important that they are rigged properly. I do not provide explanations on how to rig outfits; that is something 3D designers/modelers are familiar with. And yes, the shirt/dress/pants or any other outfit is adjusted to the person's size. I'm not sure what you have in mind with the last part of the question.
Hi
Great job with this application.
One question though, If I want to purchase the fitting application only (without models and effects) would that be a different price?
Hello,
Thank you.
As for your question: We do customize project(s), but only to a one direction – forward, by expanding the project and customizing new outfits. What you ask is a backward direction and we don’t sell empty projects.
Regards,
Edgaras Art
Hello, of course! P.s. I’m working on a new Face Tracking project using ARCore. That project will include 3 options: 3D masks, 2D face filter, camera effects. It will be much more advanced, It will be able to use all 3 at the same time, put some 3D hat, put, for instance, animated blinking eyes on your face + add some camera effect of an “old movie”.
You mean while the model animation is triggered? There would not be any difference from perspective of coloring whether it is a static model or animated one.
It seems Kinect 2 is out of production – can I buy an Azure Kinect DK instead?
Really? Well, you can find plenty of Kinect 2s on Amazon. Of course, I'm planning to add support for the Azure Kinect DK in the near future. Please keep in mind that the Azure Kinect DK costs much more than a Kinect 2.
Unfortunately, in Italy there are none, and because we're planning to buy lots of them, we need something stable and reusable.
Did you have stability issues with the Kinect 2? Is the Azure Kinect DK already for sale in Italy?
Hello, is it possible to branch your project and adapt the source code to other RGB-D cameras with different specs (e.g. higher fps or resolution) than the Kinect v2?
Hello. Unfortunately, no. It is possible only using Kinect 2, and once I get my hands on an Azure Kinect DK we'll add support for it as well. At the moment it is available only in the USA and they don't ship to other countries.
Hi, you have done great work. I have been working with ARCore for a couple of weeks and I have some doubts about it. I want to know how to create a package using the ARCore Unity SDK for developing FACE TRACKING. It would be a great help if you could give some insight on it. Thank you in anticipation.
Hi, I actually saw a video on YouTube where we can create SNAP FILTERS in Unity and deploy them to an Android mobile. They used the "arcore-unity-sdk-1.13" version of the Google SDK – they imported the Google ARCore SDK and used some of the default examples to get those filters. I want to use other example features, like identifying a human face and putting glasses on it. So my question is: how can I make an example package with Google ARCore?
Hello. The thing that confuses me in your question is that you mention the ARCore SDK. The ARCore SDK has no relation to package creation in general. If you want to create packages, take a look here: https://docs.unity3d.com/Manual/AssetPackages.html
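For completeness, a tiny editor-script sketch of what "creating a package" means in Unity terms (this is Unity's own AssetDatabase API, nothing ARCore-specific). The folder path and menu name are examples; the script belongs in an Editor folder.

    using UnityEditor;

    public static class PackageExporter
    {
        [MenuItem("Tools/Export Face Filter Package")]
        static void Export()
        {
            AssetDatabase.ExportPackage(
                "Assets/FaceFilters",                       // example folder to pack
                "FaceFilters.unitypackage",
                ExportPackageOptions.Recurse | ExportPackageOptions.IncludeDependencies);
        }
    }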
Hi,
I want to build an ARCore face tracking app as you did, but I want different types of spectacle filters. Can you please guide me on how to do that?
I have tried to follow this tutorial. Everything worked fine except the C# script. I am getting the error "The type or namespace name 'VirtualButtonAbstractBehaviour' could not be found (are you missing a using directive or an assembly reference?)" in my script.
Do I need to add any DLLs explicitly?
Hi
congratulations!!!
I would like to have the source code of the AR Vuforia Cloud Recognition – how can I get it?
Sorry, I am a French speaker and live in Africa.
Very interesting demo – I really appreciate you publishing it publicly. Can we use a WEMOS D1 Mini board or a NodeMCU board to replace the Particle Photon board? If I use a Wemos or NodeMCU, can you please give me a hint or guide to complete the same project you published here? I just want to monitor parameters like temperature and humidity in real time using an AR-based app and a simple development board like the Wemos. Actually, I can't afford a HoloLens or a Particle Photon board. If you can help me, it would be great. Thanks a lot again.
Hey, can you tell me how to prepare the texture for a 3D model so it wraps around it perfectly? I am working on a similar project, but I need the steps to create a texture that wraps around the 3D model the way I want.
Hello, I am so impressed by your amazing work, and while watching this video I was wondering what the back of the body looks like, because at the moment it showcases only front tracking. Thank you!
Hi, did you manage to get a build on UWP or Windows devices in general? Or did you only try it in play mode? If you managed to get a build, please let me know how!
Hey, I'm doing this kind of project for my final year thesis, and I need to do it from scratch. Can you help me, please? I have asked many people, but the cost is too much for me to handle. Anyway, thank you so much.
Hi, I love your project. <3
I'm a beginner and want to learn more about augmented reality. I want to build an application in Flutter and make something like this in Unity3D, then integrate this app with Flutter. Is this possible? I would love to hear from you 🙂
For the sake of your nerves and time, skip Flutter and build everything in Unity3D. In my opinion, there's no need to involve Flutter in whatever you are building, or you will get nowhere and your development time will skyrocket. Either way, everything is possible.
How do I download after payment?
I just paid for Paparmali 5 – SmARt Mirror (Virtual Fitting Room) – Kinect 2 / Azure Kinect DK Body Tracking €995.00 EUR
30/3/2021
Hi Edgaras,
Great work.
I am an engineering student also working on AR in Unity, but I am facing problems making the UI and interacting with it, which you have done in an impressive way. Can you please guide me on how you made the GUI for that app, or on the specific approach you used? That would be a huge favour.
Does this project's code include a Kalman filter?
This one – no, and the sensor information is not fused in any way. Actually, I would suggest using a quaternion implementation rather than a Kalman filter, like here http://www.himix.lt/?p=915 (sensor fusion is done there), when you use the array of 3 sensors (accelerometer, gyroscope and magnetometer). The use of a Kalman filter would not provide noticeable improvements over quaternions (I've done lots of experimentation).
BUT! If you use only one sensor, for instance an accelerometer, I would recommend using a Kalman filter.
I'm planning to make a KF tutorial in the near future.
Does this work for Kinect ONE?
Unfortunately, no.
RF\SPI.cpp.o: In function `SPIClass::begin()':
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:24: undefined reference to `SPIClass::pinMode(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:25: undefined reference to `SPIClass::pinMode(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:26: undefined reference to `SPIClass::pinMode(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:28: undefined reference to `SPIClass::digitalWrite(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:29: undefined reference to `SPIClass::digitalWrite(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:30: undefined reference to `SPIClass::digitalWrite(unsigned char, unsigned char)'
Please provide more info/details on this error. What did you do – did you copy the library folder into the proper folder, etc.?
please…,make a pdf tutorial
Sorry, but I am not planning to. I’m thinking of writing an e-book, but it won’t be for free.
Yes!! An ebook would be amazing !
Can you please provide me with the details and packages for it? rohit.gupta2267@gmail.com
Sorry, but I do not sell the book. You can try to search for it on eBay; there are plenty of other similar books related to augmented reality technology.
I don't want the book. I want the packages that you used, like the games and everything. If you can provide them it will be really helpful.
This book comes together with a DVD disc on which you find the Augmented Reality software. It might be games, it might be some other exciting things related to Augmented Reality. So this is only a DEMO of the book, with games that someone else developed.
I want to make my final-year B.Tech project on augmented reality, so I need such things. You have also mentioned the ultimate project – what is it? Can you help me build a project?
I can tell you that the Ultimate project won't be available for free; it is something that I've been working on and improving for several years – it's a combination of Augmented Reality and Arduino.
Tutorials are the way I help. You can't find anything that would fit your idea? What is your idea?
My idea is image recognition and text recognition connected to the internet. For example, if I want to know about a book, I just point my camera at the book cover and it shows me the reviews. It's basically part of the SixthSense technology developed by Pranav Mistry.
Image and text recognition is basically solved in my tutorials, but it is all predefined (images, text). If it is all predefined it would be easier. What worries me is the search on the internet. But it might be that you want a slightly different application – for example, take any book's cover picture, recognize it properly and run a search on the internet?
Yes exactly. By seeing weather it tells you about the weather.
Well, in my opinion, this is a hell of a lot of work and I'm not sure how to make it possible.
The video shows just the code to read data from the photoresistor – what about the image processing and the other stuff?
Watch the video closely; there are two parts, one for the Arduino and the photoresistor, the other for Processing and Augmented Reality while acquiring the photoresistor data.
What library does the Processing software need to run this code?
NyAR4psg – please watch Tutorial No. 1: http://www.himix.lt/?p=512
Hi! I’m new to AR and Unity but I have been a software developer for over 10 years. Thank you in advanced, I will try your tutorial. It looks like a lot of fun! 🙂
Thank you so much for the tutorial. It was really hard to find a working tutorial for the virtual button!
I’m glad I could help!
Hi and thank you! It works 🙂
https://www.youtube.com/watch?v=oHVXVJKUM6Y
Is it possible to control the mouse using (a-star 32u4 micro – a tiny arduino leonardo clone)
and MPU-9150 ?
http://fr.hobbytronics.co.uk/image/cache/data/pololu/a-star-32u4-micro-2-500×500.jpg
https://cdn.sparkfun.com//assets/parts/7/3/7/6/11486-04.jpg
I don’t know if it is a clone as you say, it might just work out for you. Just try it out and let us know.
It’s very straightforward to find out any topic on net as compared to books, as I found this post at this website.
Man,thanks a lot for this videos,i’am just a noob in AR and your videos are helping me a lot.
I’m glad!
Hello, very good job. I would like to know when you will upload the project; I am waiting. Greetings and thanks.
Hi, I’m working on it…
Brother can you help me in making virtual dressing room ? If we want to try any dress which is available in any of the shopping website.
Hi, I haven’t started this tutorial yet. But maybe you have some clothes I could use in the future for this tutorial? I won’t need to search it by myself.
Hello Admin , Is it possible to make a virtual room to try accessories 🙂 ?
If you ask generally, then of course it is possible. If you ask me will I do it? I will, but I can’t tell you whether it will be available as tutorial or DEMO. In the future we’ll see. 🙂
Will you do it for me if i pay you?
sure
Hello sir, I am a student and I want to develop my own project: interior design using markerless augmented reality. How can I do that? Can you give me a demo on that topic which I can use as a reference for my project? Please give me a demo on interior design using augmented reality.
Actually, I am very confused, so it would be really great if you could help me out. Please let me know how it works through your demo of interior design in augmented reality. I have prepared the marker-based interior design by taking reference from your demos, but now I want to do it markerless, so please give me one demo on it.
This is where I suggest you start: https://www.youtube.com/watch?v=qfxqfdtxyVA
This is markerless AR. Just start by adding your interior design content. No need for a separate tutorial on this.
sketch_aug14b:32: error: ‘ADXL345’ does not name a type
sketch_aug14b.ino: In function ‘void setup()’:
sketch_aug14b:40: error: ‘adxl’ was not declared in this scope
sketch_aug14b.ino: In function ‘void loop()’:
sketch_aug14b:46: error: ‘adxl’ was not declared in this scope
Please, can anyone say how to solve this error?
I just downloaded the library and pasted it in… I'm still getting this error… what do I have to do?
Reply as soon as possible.
Are you sure you copied the library into the right directory? Could you paste the path to this library?
After you copied the library, did you restart the Arduino IDE itself?
Suppose I want to take a runtime image directly into my application and let the user place it wherever he wants – how can I do that? Which platform would be suitable for my project? Suppose I want to take an image from an online shopping site as input to my application, and as output the user sees how that interior would look.
Just to make it clear, you're talking about the app using the META glasses, right? By saying "runtime image directly", do you mean taking a snapshot from the camera with augmented content and placing this picture anywhere you want in augmented reality?
Hello, very good job. I have this error:
Error DllNotFoundException: MetaVisionDLL
Meta.CanvasTracker.Release()
Meta.CanvasTracker.OnDestroy()
Would you know how to fix it? Thank you.
Sorry, I haven't stumbled upon this problem.
You used 64-bit Unity! Use the 32-bit version.
I would also like to try it on android mobile augmented reality Vuforia . Please help
So what’s the problem? I would start from here: http://www.himix.lt/augmented-reality/augmented-reality-android-app-export/
Are the METAPRO glasses required to test the scenes?
Does it not work with a webcam?
Correct
Hello, I was wondering if I could use this library on a 2-axis accelerometer. I will download the library now and see if you utilize function overloading so that I can pass in only values for the x and y axis; but if not, do you have any ideas? I have a 2-axis accelerometer that is hooked up for I2C ONLY. Please let me know if you have any suggestions or advice. Thank you. -Joe
ADXL345 is a 3-axis accelerometer and using the library provided, you should not have problems in acquiring that information. I2C works perfectly for that.
Is there a tutorial for scanning a single image.
What do you mean by “scanning single image”? Recognize and Track? If this is the case, the basics are starts from here: http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/
An amazing job, your tutorials are excellent. I would like to know when we will have the project available. Thank you so much.
Soon, but it will be only a DEMO, not something that I will share. At least for now.
Instead of creating confusion with words, this is what I want: to create an Android application for interior design with augmented reality…
https://www.youtube.com/watch?v=ipkz6y9mfvk
… I want to create an Android application in which the user can buy furniture from online shopping sites and, using my app, see an augmented view of that furniture in their place; if they like it, they can purchase it. I hope this time it is clear what I want to say.
Yep, it is crystal clear. You will have a hell of a lot of work to do.
Hmm, yes, I know it's not so simple or easy, but I want to do it – not for a particular output but for the sake of knowledge. Please, can you guide me on this? I will do my best – just guide me and I will work hard on it, please.
My guidance is the tutorials I make. Just be patient and I think fragment by fragment you will be able to build what you want.
Okay, thanks… but I have to submit it as my college project, so let me know where I should start. Which platform would be suitable for my project and what should I do first? Just let me know and I will be able to start my work.
Start here: http://socialcompare.com/en/comparison/augmented-reality-sdks
Lots of SDKs for Augmented Reality app development. Try to find what fits your needs best, if you think Unity3D + Vuforia is just not enough.
woooow so great thnx admin
You’re welcome
Can you give me a way to do this: after tracking a money paper and getting a lot of papers, when I zoom in on the virtual papers I want to replace them with another image
…
thank you
I can't run it, it says "The field PConstants.OPENGL is deprecated".
Could you please help me make it work?
Hi. It's really hard to tell what the problem is if you did everything according to the shown instructions. Did you try googling the error?
P.S. I hope you used the older version of the drivers provided on this website, not the newer ones?
I made it work, it runs perfectly now 🙂
The problem was that I installed Processing 3 instead of Processing 2.
That’s great!
[…] Don’t forget to subscribe as more cool tutorials awaits you! More information on this tutorial: http://www.himix.lt/augmented-reality/augmented-reality-and-leap-motion/ […]
What version of the Vuforia SDK do you use?
Right now the newest – vuforia-unity-5-0-5.unitypackage (33.17 MB)
Very Good
Great tutorial…Thanks Please send complete info asap…
It’s already completed.
Hi, greetings from Rio de Janeiro, Brazil! First, I wanna thank you for all your tutorials, I am trying to learn more about Unity3D since I started to watch your videos. But I have this question: If I want to use my smartphone as a stereo glasses with augmented reality, does Vuforia generate an output app for this? Thanks once again and I hope your ideas help to transform our world into a better place!
Hello, Ricardo. Thank you for kind words.
Actually I don't know the answer; I haven't done anything like that so far. I mean, I haven't tried to use a smartphone as AR glasses. But if you find some useful information on the internet later on while researching, please let me know – I'm interested in everything related to AR.
Hi, I have one query: after taking a screenshot, the screenshot is not saved to the gallery. Is there any solution for this?
Yeah, it's not in the gallery, but it's somewhere in your device's memory. If you know how to modify this code in order to send the pictures to the gallery, please let us all know! So far I have achieved this (saving pictures to the gallery) only by using Unity3D assets/plugins, which come at a price.
Eagerly waiting for new tutorials. Congratulations on the work done.
Hi. I have downloaded Processing 3. When I run simpleLite the following error is shown in the console. “No library found for processing.video
Libraries must be installed in a folder named ‘libraries’ inside the ‘sketchbook’ folder.”
Any Solution
Hello, download 2.2.1 version of Processing (https://processing.org/download/?processing) and you should not have the following problem.
waiting for the new tutorial 35 very good job. Greetings
Thanks, currently I’m waiting for free time 🙂
Hi,
I don't have any camera on my computer, although I would like to use the camera of my mobile phone. Is it possible to make the program search for the IP address of a camera?
Cheers
Hello. Nothing is impossible; however, I have never tried to do this. Trust me, you don't need this additional problem to solve.
Hi! Great work. Any chance for wikitude in October?
Hi, not yet.
I downloaded your project from this site,
but only one object is shown at a time; I want it to run as in your video.
I run the app on an Android mobile – is any setting required to run the app in multi-object mode?
I would suggest starting by recognizing the Kuriboh card first, and after that the Blue-Eyes White Dragon. Actually, the Kuriboh card does not have that many good tracking features. Before using the app on Android, did you test it in Unity3D play mode? I suggest you do so and check how well the multi-tracking works.
Hello Edgar, I like this sample. Actually great work with all of tutorials. Do you plan make some tutorials also for Google glass ? Thank you
Martin
Thanks! Actually I don’t plan to make a tutorial on google glass as I don’t have them. I have AR META glasses as I’ve seen more potential in it. Who knows what future will bring 🙂
Hello, I have been trying to make a Cylinder Target based on your tutorial,
and while I'm trying to upload the SIDE image to the Vuforia Target Manager, it fails, saying the Euforia of Beauty Logo image doesn't match the dimensions. Can you post a tutorial on how to measure the image so it can be uploaded to the Target Manager?
Hi, I'm not going to make a tutorial on this. I'm sure you will find your way to upload the image with the proper dimensions.
I've added a rescaled image (Download & Print Euforia of Beauty Logo to Augment the Content and Create the Tracker for Cylindrical Object (*.jpg file)); you can try to upload it once again.
you are so great!
Hello, I tried to make a rotation button on Android,
but it keeps looping.
The only way to stop it is to hold the button,
but if I release the button, it loops again.
Most likely you did something differently than I showed in tutorial.
Thanks a ton for all these knowledge sharing and I have become a serious follower of your tutorials and has also subscribed your youtube channel. These tutorials are great assets for people like me who are getting in to AR field.
Thanks for kind words.
Hi, I am totally new to AR and software development (not a programmer at all). Thank you – I tested it out and it works. But I have a question: if I want to use my own 360 picture, how can I upload it and use it?
I’m not sure what you mean “360 picture”, but it is shown in this video how to upload the image. Just change the Logo to your own picture.
Hi, Thank you for the tutorial, it is a really good kick start for someone who is totally new and want to learn like me. However, I have a question, after doing everything you did in the video, when pressing play, how do I like it to my android? Are there any videos you did which explains?
Hello,
What do you mean by “how do I like it to my android”.
Sorry, I meant link* it to my android. typo.
You just need to export it for Android OS like shown in this tutorial: http://www.himix.lt/augmented-reality/augmented-reality-android-app-export/
Hi,
Thank you for the tutorial video, it is a really kick start for someone totally new like me.
It would be great if you can help me with my question.
After doing everything you did in the video with a PC, and press play button, how do I link the program to my android device?
Is this marker-based or markerless-based augmented reality ?
14th Tutorial is markerless.
this one is marker-based example: http://www.himix.lt/augmented-reality/augmented-reality-marker-based/
I have been trying to run this code for hours, but for some reason the Arduino IDE ver. 1.6.5 on Windows 10 cannot find HMC5883L.h. I've placed copies of the library in the main library folder, the sketchbook library folder and the hardware library folder, but it still cannot find it. I also used the Library Manager. Help!
Hi, I have Windows 10 x64; I tested it right now and it works perfectly. Make sure that the library directory looks something like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_Example
not like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_library\HMC5883L_Example
Depending on how you extract the libraries, you might end up with two identical folders nested one inside another (HMC5883L_library\HMC5883L_library\HMC5883L_Example), so I assume Arduino can't recognize it.
Is it possible to turn on a relay with a button on the HTML page?
Of course, the same way you would turn on an LED. Easy as that.
OK, but where can I find the code?
You can use this example:
http://www.himix.lt/arduino/arduino-and-ethernet-shield-for-home-automation-control/
Hello,
how can I make a button that, when clicked, duplicates/adds the model, and another button that removes the model?
Thank you.
Hello, this is easily done, but I won't start coding for you here. I suggest you google Unity3D C# code for that.
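For anyone searching, here is a minimal sketch of the duplicate/remove idea (my example, not the tutorial's code). Hook the two public methods up to UI Buttons; "modelPrefab" stands for whatever model you augment, and the offset is just so copies don't overlap exactly.

    using System.Collections.Generic;
    using UnityEngine;

    public class ModelSpawner : MonoBehaviour
    {
        public GameObject modelPrefab;
        private readonly List<GameObject> spawned = new List<GameObject>();

        public void AddModel()
        {
            Vector3 offset = Vector3.right * 0.1f * spawned.Count;   // shift each new copy slightly
            spawned.Add(Instantiate(modelPrefab, transform.position + offset, transform.rotation, transform));
        }

        public void RemoveModel()
        {
            if (spawned.Count == 0) return;
            GameObject last = spawned[spawned.Count - 1];             // remove the most recent copy
            spawned.RemoveAt(spawned.Count - 1);
            Destroy(last);
        }
    }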
Hi, I don't know why, but the Arduino software doesn't recognize the ADXL345 library. The program shows it in black instead of orange.
The folder is in this directory: C:\Users\Leyre\Documents\Arduino\libraries\ADXL345_library, with the other libraries (which it recognizes well), and the ADXL345_library folder is not duplicated.
Nevertheless, when compiling, the software doesn't show any error.
Any idea? I need help, please.
First you said it does not recognize the library, but then you say it "doesn't show any error". I don't understand. How do you know that it can't recognize the library if there are no errors written?
Thank you for answering.
The program shows the library and all the functions related to the accelerometer in black instead of orange, whereas the rest are in orange. Isn't that weird?
Nevertheless, it compiles and I can upload the program to the Arduino UNO board (although the data acquired is really weird).
That's really weird; it's hard to suggest something right now, but answer this question: do you really use the GY-85 board, yes or no?
This is important because I had some weird data readings while using a board with only an accelerometer, and I couldn't find the solution for that.
Maybe the colors of the text have no importance. The fact is that I'm pretty new to Arduino, I'm still learning basic things.
Another question: how can I get angles between -180 and 180 degrees? I get confusing data in the serial monitor.
Try and search specific library for GY-85 on the internet.
Hi again,
I have been trying the code the entire day but I don't understand the outputs.
I want the angles in degrees. On the one hand, I get rolldeg and pitchdeg between 20 and -20 instead of 90 and -90 degrees.
On the other hand, when I print anglegx, anglegy and anglegz I get signals that change even when the IMU is stationary.
I would like to add a complementary filter to your code, but with these output data I can't.
I have watched the video and it seems to me that the output data shown there are correct, but I don't get the same.
Could you help me please? I'm a little desperate because my project depends on a good measurement of the angles.
Best regards, and sorry about the mess.
If you send me some pictures of the sensor wiring to the Arduino microcontroller and screenshots of the error in the program, maybe I will be able to help you.
Hi Edgar, thanks for these great tutorials;
they have been a great help.
I'm just wondering why you don't have audio in these tutorials explaining what you do – it would provide much more help.
I have my reasons.
Can you please create a video tutorial on developing mixed realities using vuforia and unity3d?
Thanks in Advance
Can you show me an example of what you really expect on “mixed reality” tutorial?
Thanks for your reply. I really need a tutorial on how to use Vuforia and the Cardboard SDK.
The app would scan the image target to track the AR world, and the user would be given a button that, when looked at, teleports the user into the VR world and vice versa, just like the Vuforia sample.
Thanks,
would be expecting the tutorial soon.
Great idea, but I wouldn’t expect tutorial soon.
I am having the same problem. After checking, the error is on this variable:
ADXL345 adxl; // the variable adxl is an instance of the ADXL345 library
Arduino_AccelerometerADXL345_Servos:32: error: ‘ADXL345’ does not name a type
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void setup()’:
Arduino_AccelerometerADXL345_Servos:40: error: ‘adxl’ was not declared in this scope
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void loop()’:
Arduino_AccelerometerADXL345_Servos:46: error: ‘adxl’ was not declared in this scope
Can you explain step by step how to resolve this error?
Check the path to the library. Wrong directory.
I could make it work.
thank you.
Hello sir,
I want that sensor. Can you provide me with a link to get the shock sensor module? The sensor is very important for my bachelor's project.
Thank you.
You’ll find it on ebay with keywords “Shock-Knock Sensor KY-031”.
Hello! Student studying physical computing here. To run by “dropping in,” are there any constraints for dimensions/scale or file size for the OBJ model?
If your *.OBJ model size is huge, most likely, the model itself is really complex and has lots of vertices/polygons. I don’t know the exact constraints on this matter, it also depends on your computer specifications. You should sort this thing by experimenting with the models you have (if it’s really complex).
Hello, when I move the sensor fast, the servo motor locks up and does not follow the movement, and it makes a noise (tec tec tec); then, after resetting, it goes back to normal. Can you say what this could be?
I'm using a TowerPro MG946R servomotor powered directly with 5 V from the Arduino.
Are you sure that sensor readings are correct?
as igniting 2 LED with applause and which lights decicir first?
Please, repeat the question, can’t understand it.
Hello! Good day! I am a beginner and I need your help, please. I downloaded your code; I have the ADXL345 3-axis digital accelerometer and 2 servo motors. I made the right connections you've indicated above. But, like other reviews say, there are some errors… What should I do, sir, and where should I start? I don't understand the library stuff you mentioned – what library is it, where can I find it, and where do I copy it?
Hi, from 0 to 12 seconds in the video, I’ve shown where to put the library folder. Do it so, and use the code. Good luck.
How do you make the augmented reality application compatible with Windows? What programs are used?
For starters, I didn't make the app for this book; this is only a DEMO of what other developers did. One of the users provided a link (a Vuforia standalone) in order to make Vuforia work on the PC platform. However, I haven't tried it yet, and right now I can't find the link – it's somewhere in the comments of one of the tutorials.
Hi, I already made those examples, but I have a question: do you know about the DragonBoard 410c? I want to run my app on that board; I installed Ubuntu Linaro, but my app doesn't work with it… Did you export and run any of your apps on some other board?
Sorry, but I don't know; I've heard about the DragonBoard 410c only from you right now 🙂
Hi,
I will buy the META Dev Kit 1 next time. I have a question about your work:
which program do you use to make these videos?
In the videos the FOV is extremely big! Is this only in the video, or on the META too? I have a BT-200 and this device has a small FOV at near distance.
Have you had contact with META about when Dev Kit 2 appears?
I use TechSmith Snagit and Camtasia.
I suggest comparing the specifications of the BT-200 and the META glasses. I haven't had the opportunity to use other AR glasses (only META), so it's hard to compare, but those who put the META glasses on for the first time say that the FOV is not so big. So, of course, watching a video here versus using it for real is like night and day.
Can you re-specify the last question?
Actually, the META glasses use the Moverio see-through display as their base display, so the FOV will be exactly the same between both devices.
Hi! Great tutorial, but I have got stuck at the building part… I am doing the same as in the video, but my APK is not working on Android… it shows a black screen… I have never developed for Android and I believe I am missing something in the setup. My project works great in the Unity preview and the whole build process is OK too (no errors), but after copying it to the device and installing, nothing happens. Can you point me in some direction? I need to run the project on the phone, and everything seems to be OK except this.
Thanks a lot!
It's hard to say, but if it's not a "top secret" project, send it to me and I'll take a look. We can start from the *.apk file. Maybe it's a smartphone problem – who knows. It's also worth googling this problem.
Hi
Thank you so much… this is very useful to beginners like me.
I have a one question.
I tried this with two different objects and tested it on an Android device. The issue is that both my objects are visible from the start, and the buttons didn't work…
Did you put the virtual buttons on a textured area of the tracking object? If not, I suggest you do so. Don't put virtual buttons on a plane without any texture. I hope that helps.
I can't run the file after making it like in your video.
Is the webcam missing???
http://www.uppic.com/uploads/14461898941.png
http://www.uppic.com/uploads/14461898942.png
You installed 64-bit Unity; reinstall with the 32-bit Unity instead.
Hello,
thanks for your answer.
I think Meta will bring out a new version of their glasses soon; I think the name is Dev Kit 2.
Do you have any information about this release?
THX
Oh, don’t know anything about it, I thought the next version will be for consumers.
Hi, I use Processing 2.2.1 and installed the nyar4psg 2.0.0 library. When I run simpleLite the following error is shown in the console (https://dl.dropboxusercontent.com/u/39808973/Screen%20Shot%202015-11-01%20at%207.52.58%20AM.png). Can you help me please?
If the library is in place, it’s really hard to say what else could be wrong here.
What is the folder path to your library?
Documents > Processing > libraries > nyar4psg
I can see that you're not using Windows, so it's hard for me to suggest something. Also, you're using Processing 2.1.2, not 2.2.1, but I don't think that this causes the problem.
Hi, a thousand thanks for your code! I tried several that didn't work, but this one works perfectly!
Btw, I will need to interface with two sensors; I don't know how to change the code to measure two. Could you please help me?
Thank you so much in advance. Have a great day!
Sincerely,
Caryn
I'm glad it works. The code is not complicated at all; to add an additional sensor, try to sort it out by yourself.
May I know what I am measuring in the code? Is it vibration amplitude or vibration time?
Amplitude
Thanks. Can you tell me the SI unit of the amplitude?
No, I can't. These are raw data readings from the sensor.
alright.thanks anyway
Hi,
I'm an interactive developer using Unity3D; I sincerely need your help on how to integrate the Vuforia and Google Cardboard SDKs in Unity3D for architectural visualizations.
Great work with the tutorials.
Thanks in Advance
Hello, John. Haven’t done anything on that yet. But I will soon, you can wait for a tutorial (2 to 3 months), but if you want it to be quicker it won’t be for free.
Yeah!! The buttons are already on a textured area.
But I get an error at run time: it couldn't find the path of ICSharpCode.SharpZipLib.dll to unzip the file. I couldn't understand it; I am not a programming guy…
Can you help me?
Is C# a language that supports Android?
Sorry, it is difficult for me to understand the script and do something with it.
C# is supported by Unity, and using Unity we can export an app for Android devices (and other OSes). I really haven't stumbled upon this problem; can you copy-paste the directory of your project files? Have you tried anything simpler up until now? For instance: http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ I suggest you start from there and then move on.
After doing everything as in the video in Unity 5.2.2 32-bit, the Unity-chan model isn't rendering. What's wrong?
I have to print the tracker before augmenting it. It can't work with just any surface?
Correct.
Can you please share the code?
Actually, I needed the code with which you drew the rectangle around the marker and how to return the coordinates of the marker's corners.
The code is built into the library itself, so just start the example code and look for the code fragment related to the marker corner coordinates.
Thank you very much, I just hadn't gone through all the examples properly. Got it now.
How do I apply for a META login account to get the tutorial?
Try here: https://www.getameta.com/ but I’m not sure whether you will be able to register if you haven’t bought a META glasses. Try it.
I think it's unavailable now; they moved to META 2 quite a while ago. Who knows, maybe META 3 is on the way 🙂
Is there any way to add animated 3D models to this???
Well, actually there is, but it would be more or less a "workaround" that I would not suggest using. What do I mean? Basically, you would need to export your model animation for every frame. So most likely you would end up with lots of *.obj files, which you would need to load frame by frame in the void draw() part.
Option 2: search for a library that can import an animation file (the *.fbx extension). I couldn't find a better solution back then, but maybe one is available now. Who knows… some research is needed.
Option 3: try AR tutorial No. 14, which involves Unity3D and Vuforia; you can add animations there quite easily.
Thanks for your advice. Actually, I went through all your videos and the Unity ones are quite cool. But I'm a newbie with Unity and coding any logic in it is becoming difficult.
But I will try the fbx route you mentioned.
Hi, thanks for all the tutorials.
I made my AR apk with Unity + Vuforia; my mesh model comes from my own SketchUp models (imported into Unity as fbx). When I try it on PC it works fine, but when I install it on my smartphone it works with a slow response and lag. Any suggestion regarding the AR apk size? (Mine is 187 KB.)
Next question… how do I make a quit button in the app? Thanks.
Hi, I don't think it has anything to do with the *.apk size; I would look into the complexity of your models, or maybe your smartphone has low specifications. Look here http://www.himix.lt/augmented-reality/augmented-reality-screenshot-and-sharing-on-facebook/ or search on Google for "quit button in unity".
Please help: I did exactly as in the video but the camera didn't open, and this appears.
Any help?!
http://www.uppic.com/uploads/14469334541.jpg
Use 32bit Unity.
Great!!!
Do you use Vuforia ? Do you need Network ? Tutorial ? :-)))
Yes, I used Vuforia. What do you mean by “do you need network?”, “tutorial?”?
Hi,
how can I take another snapshot without replacing the older one?
I want to create a snapshot gallery stored on the SD card.
Thanks.
You will figure it out, I believe in you 🙂
Please tell me, I cannot figure it out…
One more thing.
I'm now developing an AR/VR app; does this snapshot button just capture the on-screen interface?
Does it work with a stereoscopic interface? I want just a monoscopic (single) image saved.
*sorry for my English
Everything is shown in the video, isn't it?
Network = 2 devices and 2 users controlling one character. I'm working on this but it is very heavy to develop. The synchronisation has to go over the network.
Tutorial? Would you make a tutorial on this, with downloadable source etc.?
I haven’t done anything on this yet and it’s not in my plan list to do it.
would be great if we could create a game based on that … could you make a tutorial? … I have many ideas to put into practice!
Maybe in a far future.
JoWeb can you do a tutorial on connecting via network ?
It’s not in my plan list.
But can it run even though my OS is 64-bit? And if it doesn't, is there any other solution?!
32-bit Unity can run on a 64-bit OS.
Is there any way to run this on 64-bit Unity?
I don't want to remove the current Unity and install the whole thing again…
I hope there is, if you’ll find one – let us know.
Is it possible to give us any info or hints about the making of this demo, or can you please mention any reference we could use to learn scripting in Unity and achieve the same results?
This tutorial http://www.himix.lt/augmented-reality/augmented-reality-fusion-effects-using-multitarget-tracking/
is the closest to this demo, I would start figuring out how this works.
Is there a way to paint textures onto walls/buildings as a user moves their camera around any street?
If there is a way, I don’t know how to achieve it.
Yes, I know, of course, that you are using the distance script… thanks… but the main part is how to modify the distance script parameters, or how to write a new script myself using PlayMaker or something else.
What is the best way to learn to script in Unity?
Go to the unity3d website and watch the tutorials.
Hey admin do you have any example which runs on Matlab??
Nope
I have found all of your tutorials very helpful. Great work.
Hello,
Is it possible to turn the buttons into virtual ones that can be pressed with your hand, like you demonstrated in another tutorial (No. 19)? I don't succeed when I try to merge the UI code from this tutorial with the Virtual (Vuforia) buttons one…
You already answered your own question.
I thought it was just because I'm a lame coder… Is there a way to extend interactions with virtual buttons (Vuforia-Unity)? For instance, jumping to the next scene or playing a video?
Btw, thanks for all your tutorials, they are very helpful.
Yes, there is a way. The same way I switch models in this tutorial (http://www.himix.lt/augmented-reality/augmented-reality-virtual-buttons/), you can add different functions: load another scene and so on. Just dive into the code in "VirtualButtonEventHandler.cs" (starting from case "btnLeft": and case "btnRight":).
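Roughly, the idea could look like this; an untested sketch following Vuforia's virtual button handler pattern, not the exact tutorial file, and "AnotherScene" is just a placeholder name:

using UnityEngine;
using UnityEngine.SceneManagement;
using Vuforia;

public class VirtualButtonSceneLoader : MonoBehaviour, IVirtualButtonEventHandler
{
    void Start()
    {
        // Register this handler with every virtual button found under the ImageTarget.
        foreach (var vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonAbstractBehaviour vb)
    {
        switch (vb.VirtualButtonName)
        {
            case "btnLeft":
                SceneManager.LoadScene("AnotherScene"); // jump to another scene
                break;
            case "btnRight":
                // e.g. start a video, toggle a model, etc.
                break;
        }
    }

    public void OnButtonReleased(VirtualButtonAbstractBehaviour vb) { }
}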
I will try until I make it, thank you very much!
There's one more issue no one could answer on Vuforia's forum: is it possible to trigger a whole environment in which the viewer can dive?
Should I use extended tracking (the triggered image would be much bigger than the marker) and keep the image target active even when tracking is lost (can I just disable this function to keep it on even if I turn the device/camera in another direction)?
I've read about different plug-ins like Unified Coordinate System that could help build augmented environments… Could you point me in a direction I should go?
Cheers!
It is possible, but I haven't done this in Unity. Basically, what you need I did here with a MARG sensor (http://www.himix.lt/arduino/arduino-and-virtual-room-using-mpu-9150-marg/), just with pictures and without tracking any image target. The same applies to smartphones and tablets. I haven't heard anything about the plugin you mentioned.
Hi… how can I connect it to the Ethernet shield? And what could I set up as an output device using the flame sensor?
http://www.himix.lt/arduino/arduino-and-ethernet-shield-for-home-monitoring-over-internet/
http://www.himix.lt/arduino/arduino-and-ethernet-shield-for-home-automation-control/
Are you serious?
On the Asset Store, the Leap Motion Core Assets version is 2.3.0. But you give a link to 2.3.1?
I am serious; currently there's probably an even newer version. This asset was downloaded not from the Unity Asset Store but from the Leap Motion company's website.
Sorry, man. Forgive my offence.
You are right. The Asset Store's Core Assets package is old, and there is indeed a newer version on the Leap Motion website.
Hi… Thank you so much for this awesome tutorial. By the way, can a particle system be controlled by our 3D object instead of using Arduino? For example, when I click a 3D object such as a factory, a particle system, for example smog, would be emitted… I'm trying to make my AR project interactive. I really hope you can help me. Thank you.
Yes, this is possible. But you will have to sort it out by yourself.
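Roughly, the click-to-emit idea looks like this; an untested sketch which assumes the factory model has a Collider and that you drag the smog ParticleSystem into the field yourself:

using UnityEngine;

public class SmogOnClick : MonoBehaviour
{
    public ParticleSystem smog;   // assign the smog particle system in the Inspector

    void OnMouseDown()            // fires when the object's collider is clicked/tapped
    {
        if (smog != null)
            smog.Play();          // start emitting
    }
}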
Oh, I guess so too. But I don't think I can make it by myself because I'm so bad at coding, etc. Anyway, thank you for replying.
Just wanna say you are awesome!! Thank you very much for all of tutorials
Thanks.
Hi, first of all thank you for all of your tutorials.
Could you use a cursor-highlighting tool in your future tutorials?
Can you suggest one, maybe one that you use?
[…] http://www.himix.lt/augmented-reality/augmented-reality-user-interface-unity3d/ […]
[…] Basic tut (1st unity #14) http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ […]
[…] http://www.himix.lt/augmented-reality/augmented-reality-user-interface-unity3d/ […]
Can I implement the app on a tablet connected to an external camera?
And how do you track each part of your body? Can Kinect distinguish each part and give it a tag?
Thank you,
waiting for a generous reply.
I don't know whether an external camera can be connected to a tablet; I haven't tried to do so.
And yes, Kinect can distinguish different parts of your body; I mean, it tracks your body parts/joints and their position and orientation.
Emm…
can you help me?
Why won't my app show my 3D model when I run it?
Thanks
Hello, I would re-watch the video closely. Most likely you forgot to put your model inside the ImageTarget? Do any errors show up?
It still doesn't respond.
Maybe you can help me with my project?
Maybe, who knows 🙂
hola soy de peru . he vist o los tutoriales de unity3d sitio web y los scripts no son correctos .. gracias admin por la ayuda que nos brindas en tus tutoriales por que los ejemplos de unity web sites no me resultan nada
Hello, could you write it in English?
“hello i’m from peru. i visit the tutorials of unity3d on the website and the scripts is not the right… thanks admin for the help that you gave to us on your tutorials because the unity’s examples in the website not help me.”
-Something like that!, sorry for my bad english too!
So you’re saying my scripts are better than Unity3D scripts? :)))
Sir, can I ask about the details of this project? I need some help for my school project… I don't know what to buy and how much it will cost.
Everything is on the website, nothing more.
I’m amazed, I have to admit. Rarely do I encounter a blog
that’s both educative and entertaining, and without a doubt, you’ve hit the nail on the head.
The issue is something which too few men and women are speaking intelligently
about. I’m very happy that I came across this in my hunt for something regarding this.
Thanks
Hi, I have followed your tutorial and everything works fine in play mode, but I can't build the project for Android.
Worth mentioning that I configured everything as you show in the money tutorial, and that project works fine… Can you help with this, or is text recognition not working on Android?
Thanks in advance.
Bart
Hi, text recognition is working, but the video display is not programmed in the proper way needed for Android devices.
Is it possible to add voice recognition to this character?
I believe everything is possible. Have I done it? not yet.
Sir, thank you for helping me. Can you please tell me where the LED should be put?
– regards, Meheer Shukla
it’s all in the code:
#define redLed 7
#define greenLed 6
#define blueLed 5
May I ask, what if there are two models in one scene? Your tutorial only has one model per scene. Then what happens with the tag? Can I tag both of my models as Model?
Sure, why not, if you want to apply the same function.
I have tried setting both characters to the same tag name, Model, but it does not work.
What happens is that only one character scales up and down when I click the button; nothing happens to the other one.
I'm creating a scene that has two characters: one person performing CPR and the other person is the patient. I need both characters to scale up and down at the same time when the button is clicked.
First try it out on 2 simple cubes (tag them), try to scale them, and tell me the result.
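If only one object reacts, the script is probably grabbing a single object (e.g. with GameObject.FindWithTag). A rough sketch of looping over every object with the Model tag instead; the method names and scale factors here are just placeholders you would wire to your UI buttons:

using UnityEngine;

public class ScaleTaggedModels : MonoBehaviour
{
    // Hook these up to the UI buttons' OnClick events.
    public void ScaleUp()
    {
        foreach (GameObject go in GameObject.FindGameObjectsWithTag("Model"))
            go.transform.localScale *= 1.1f;   // enlarge every tagged model
    }

    public void ScaleDown()
    {
        foreach (GameObject go in GameObject.FindGameObjectsWithTag("Model"))
            go.transform.localScale *= 0.9f;   // shrink every tagged model
    }
}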
I tried to make multiple objects, but only one object can rotate, scale, etc. Please help me.
Hello, Your work is so amazing! Can you provide a copy of the source code for me?
At the moment it is not shareable.
Admin, you are my teacher and motivator… I will donate for sure…
Thanks for the good will.
Can you help with a tutorial or any information about User Defined Targets (Vuforia)? Thanks for the advice.
Soon it will be available.
Can I use an external webcam???
If yes, then how??
For tracking Human body? No, not in this case.
I tried your money augmented tutorial No. 28!! That was amazing!!
How can I use the user interface in an Android app?
The same way as in this tutorial.
May I know where the 3D skull and Iron Man objects come from… Can we call another object to be augmented on top of the head?
https://docs.google.com/uc?authuser=0&id=0BygvzTqnzm_wTXV2UHlWN2NMXzg&export=download extract it into the data folder, and yes, you can.
My camera doesn't start. Kinect is installed properly but doesn't start after pressing play; it's totally blank. How do I solve this problem?
I would start from the beginning. First test whether your Kinect is working properly on the PC, maybe with some other sample code/apps.
It is cool. Great work.
how to create a 3d object of my choice?
How to create or how to use already created models?
You can create models using 3Ds Max, Maya, Blender, SolidWorks and lots of other 3D modelling tools.
How to use it? you should put the model in data folder and change some Processing code (you will find out if you watch closely).
unity collider trigger
Nope, actually no colliders used here.
Thanks for the tutorial. Could you explain the process of extracting the 3D monster models from the Yu-Gi-Oh game?
I haven’t extracted models from the yu-gi-oh game.
How do I reset the animation when the target is detected? It seems the animation just pauses/continues playing when the target is not detected.
Yeah, for this you will need to code a little bit; just google how to stop an animation in Unity3D.
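If it helps, a rough sketch of the idea; it assumes the model uses a legacy Animation component and a clip name of your own (both are placeholders), and that you call the method from the ImageTarget's tracking-found event:

using UnityEngine;

public class RestartAnimationOnFound : MonoBehaviour
{
    public Animation targetAnimation;     // the model's Animation component
    public string clipName = "Take 001";  // hypothetical clip name

    // Call this from OnTrackingFound in the trackable event handler.
    public void RestartFromBeginning()
    {
        targetAnimation.Stop(clipName);   // Stop also rewinds the clip to the start
        targetAnimation.Play(clipName);
    }
}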
Android Ultimate Plugin Lite seems to be paid, not free. Did you get the Android Ultimate Plugin Lite asset for Unity3D for free? Thanks.
OK, is there a question hidden in what you wrote?
A big big big thanks
Hi, great tutorial, thank you; you helped me a lot in learning AR. Can you help me with one question: moneyar.apk installs OK, but when I open it the screen goes black and nothing happens 🙁
OK, thanks!! Can you tell me where you got the iron chest model from?
On the wide internet, my friend, google it, can’t remember the exact website.
Hi, the last question got resolved; it was my Android version. I tested with another one and now it is working. 🙂
Another excellent tutorial. Android Ultimate Plugin now costs $5 🙁
Oh well, it’s not so much, but maybe you’ll find something for free, but you’ll need to modify some code.
Great tutorial, thanks, but I want to ask a question. I'm developing a Vuforia app,
but when I take a screenshot using your code, my result is only a white screen, and the augmented view has only a white background. Can you help me, please?
Sorry for my bad English, thanks.
I have a better “taking a picture” function, but I am not willing to share right now.
I cannot find the file [patt.hiro]
I want to change the marker file.
Do you know where the file is?
in nyar4psg library folder
Hi, I did it the same way as you did and everything worked. It's just that the UI buttons stay on screen even when the image is not tracked; they just stay at the last place the image was tracked. I want the buttons to appear where they should when the image is tracked, and to disappear when it is not tracked. Please help. Thanks in advance.
Hello. Put your Canvas (with buttons) inside the Image Target.
Hi, first, thank you for your work! I have the same problem; I did everything, but the UI buttons stay on screen no matter what. Whether the Canvas is inside or outside the ImageTarget, the result is the same: the UI buttons are always on screen.
Please help. Thank you.
I have the same problem as Nqb… I tried making the Canvas a child of the Image Target, but the Canvas is still rendered on the screen even when the image is not tracked. Can someone please help me out? Thanks in advance.
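For everyone stuck on this: parenting alone may not be enough if the Canvas is in Screen Space mode. A hedged sketch, mirroring Vuforia's DefaultTrackableEventHandler pattern (the field and class names here are illustrative), that switches the canvas on and off with the tracking state of the ImageTarget it sits on:

using UnityEngine;
using Vuforia;

public class CanvasTrackableHandler : MonoBehaviour, ITrackableEventHandler
{
    public GameObject uiCanvas;                    // the Canvas holding the buttons
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        uiCanvas.SetActive(found);                 // show UI only while the target is tracked
    }
}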
Best tutorial..
Can you create a tutorial about hand interactivity with buttons using Kinect, like a virtual dressing room?
I’m not planning to
Hi, I printed the marker but it's not working. Can you please tell me the dimensions of your marker?
great videos btw
8×8 cm, thanks!
thank you soooooo much
How do I use this with an Arduino Uno?
The same way as with the Nano; just connect to the proper pins, which are also written in the code itself.
I have a warning!!!
WARNING: Category ” in library UIPEthernet is not valid. Setting to ‘Uncategorized’
what’s your full path to the library?
Thanks for these tutorials. But when I try to copy your C# file into Unity, this error appears. Do you know why?
http://www.uppic.com/uploads/14531513431.jpg
I would guess the plugin was imported wrong, unless you downloaded the whole project folder?
Hey, could you suggest any tutorials for using real-world marker input instead of virtual buttons? The plan is to make an application which reads the position of a real-world marker and responds based on its hovering over a real-world button printed on the paper page, instead of virtual buttons.
I don't have such a tutorial in my list, but I would know how to code it.
Can I put a dress on the body instead of that Iron Man?
Why not?
If I place several game objects on one marker, can I scale/rotate/move them individually?
You can.
Can you please tell me how I should move them individually???
– Please guide!!!
Can you please tell me where I can find the full algorithm for the code?
Have you watched the tutorial from start to end? It’s all there.
Why am I getting this error:
an error occurred while trying to enable vuforia play mode
I believe it was a one-time error, no? It happens from time to time.
Hi, I am from Turkey. Please help 🙁 I need this code with an LCD. How can I write it? 🙁
So try to combine them; there is also an LCD tutorial.
Can you make a video tutorial on playing a video in place of the animation? Like, when the target is located, just play a video.
I already have the content gathered; I only need to film it. I'm planning to do so in February and I will upload it to YouTube.
I got the same problem as Nqb, and I already put the Canvas with buttons inside the Image Target, but the result is still the same; the buttons still pop up when the marker is lost.
Can you get this to work with an Arduino Uno or Mega by any chance, and what changes would need to be made to the code?
What's that device you're using?
Leap Motion Controller
hello, thx for the tutorial its helpful 🙂
As far as I understand, the primary surface is used to track the scene, which could be the size of a dining table. What if there are 5 image targets in 5 different places (not far apart)? Would that extend the size?
Example: let's say I put 4 image targets at the edges of a table. Would they all be tracked at the same time with the same "props" for all of them, or would each target define its own scene and props?
Hopefully you can understand what I mean 😀
This is something that you will have to test on your own, but I would say each target would have separate “props” not the same. This is my logical guess.
Why doesn't the canvas appear when I run it on a smartphone?
Does everything work perfectly on the PC?
Yes man, on PC I can see the button and the panel; on the phone it seems like they're invisible. When I tap randomly on the screen and hit them, they work, but I can't see them!
Hard to tell; I would need to look into the project.
How can I send it to you? By email?
Sorry, I won’t have time to deal with it.
Hi, I downloaded your project, and when I exported it for Android I couldn't see the button on the smartphone, but when I download your apk it works. What's the problem?
I have just retested it (exported the apk). It works great, actually.
How can I detect the color white?
Why are the values not the same each time, even though the colored object is the same and at the same distance?
Hi! Thanks for your work! I tested it and it worked very well.
I'm looking at the code and I have some stupid questions to ask you. I would be glad to get your feedback (sorry in advance for the dumb questions, but I'm not an expert in Arduino).
0.1 – the raw output of the accelerometer is what? Voltages? (from readAccel)
0.2 – the raw output of the magnetometer is what?
0.3 – the raw output of the gyro is what?
1 – line 47: when reading the gyro, accelerometer and magnetometer, you have a FOR loop of 201 iterations. May I ask why?
2 – line 88: why 255? Is there a pre-set offset of 255 degrees?
3 – line 89: why are you dividing the raw data by 256?
4 – lines 92-95: I don't understand what you are doing here.
5 – lines 113-115: why is it divided by 14.375?
I was looking at your comment from 15-05-2015. It is interesting: the Kalman filter doesn't improve attitude determination with quaternions. May I disagree? The Kalman filter (if well tuned) would drastically improve the representation of the attitude over time, enabling what is called a "smooth" representation.
Here (http://www.himix.lt/?p=915) you are using just quaternions, no KF, right?
Have you ever tried to implement it on an Arduino Uno?
I heard rumors that it's impossible due to limited memory?
Thanks in advance for your kind answers,
I really appreciate your wonderful work! It works nicely.
P.S. Do you have an oscilloscope for dumb macOS users?
Cheers
S
Hello,
1. Concerning all the first questions: look up some theory on the internet about how it works and read the sensor datasheets; that will answer lots of your questions.
2. "The Kalman filter (if well tuned) would drastically improve the representation of the attitude over time, enabling what is called a 'smooth' representation." Show me some proof of that "drastic" improvement. Quaternions already give a smooth representation.
3. "Here (http://www.himix.lt/?p=915) you are using just quaternions, no KF, right?" Correct.
4. "Have you ever tried to implement it on an Arduino Uno?" Yes, I have tried it.
5. "I heard rumors that it's impossible due to limited memory?" Wrong, there's enough memory.
6. No, I don't have an oscilloscope for Mac.
6. No, I don’t have oscilloscope for Mac.
Good luck with your work
Stu figlje e bucchin. thanks 🙂
Sorry?
I am ready to buy… Please, how can I get the code for the MF522, the NC door lock latch, and the Arduino? Send the details to my email address.
I’m not selling anything.
I did everything perfectly, but when I rotate the object it does not rotate on the Y axis; instead it makes a combined motion that continuously sends it downwards, making it impossible to orient. Why? How can I fix it?
Help me please! I've already tried some Processing files for the Kinect, like this one: https://github.com/shiffman/OpenKinect-for-Processing/blob/master/OpenKinect-Processing/examples/Kinect_v1/RGBDepthTest/RGBDepthTest.pde using the model 1414, and they worked perfectly. But for this project I installed the libraries and ran the Processing file, and it just opens and freezes on the gray window, without showing any real-time images. It doesn't seem to show any errors. What do you think it is?
I know this might be a stupid question, but I noticed when opening the project I downloaded from this site that Unity immediately opens the "Game" tab and the "Scene" tab is missing. I was wondering how you did that. Thank you so much for this tutorial; I really learned a lot from this experience.
This was not my purpose 🙂
Hello, it's great. Can I buy some controllers?
Hi. It's not for sale yet.
When I try to start it, the camera freezes. Any suggestions? I am using a notebook.
Without any errors?
Thank you for your work, your tutorials are very helpful!
I was wondering if it could be possible to show the panel only when the marker is tracked?
It is definitely possible, but I won't go into details; google the Unity/Vuforia forums.
Thanks, I've managed to find a solution on those forums! But I'm now facing another issue, as I have multiple targets. It works great until I click the camera button and track another target: the share button from the previous target still appears… Is there a way to restart/disable your script on OnTrackingLost?
I ended up duplicating your script and calling the matching canvas for each ImageTarget. I don’t know if it’s the best thing to do but it is working! Sorry for the bother and thanks again for your tutorials!
Hi admin!
After tests on several devices, I am facing a few troubles on a tablet running Android 4.4.2:
– If I take a screenshot of an ImageTarget and share it right away, my app restarts.
– If I take a screenshot of ImageTarget A without sharing it and take another one of ImageTarget B right after, my app closes.
It works great on smartphones running Android 4.2.2 and 5.1.1 though; any idea what the problem could be?
Hi, I know about the restart issue, and I am not sure why the sharing function causes it.
Hi,
I downloaded the script file and loaded it directly into my scene. It didn't work: both the buttons and the models appear as soon as I enter play mode.
Later I also tried changing the names of the buttons and models to match what I have named them in my scene. It now shows me a compiler error to fix.
Could you please explain which attributes need to be changed before loading the script?
Thanks.
Everything is shown in the video. You can download whole project file and test it out first.
Hi, your tutorials are very helpful, great job! I've got one question: is it possible to take a snapshot with the interface graphic elements? In my case the snapshot works, but without the augmented layer. I'm working on a simple app with OpenCV for Unity. Maybe I have to change the camera name in your script?
Thank you, and please keep your tutorials coming!
It's possible with or without, it doesn't matter. Just dive into the code; I'm hiding the UI elements there.
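Roughly, the hide-capture-show idea looks like this; an untested sketch, not the exact tutorial script, and the canvas reference and file name are placeholders:

using System.Collections;
using UnityEngine;

public class SnapshotWithoutUI : MonoBehaviour
{
    public Canvas uiCanvas;                     // the UI to hide while capturing

    public void TakeSnapshot()
    {
        StartCoroutine(Capture());
    }

    private IEnumerator Capture()
    {
        uiCanvas.enabled = false;               // hide buttons etc.
        yield return new WaitForEndOfFrame();   // wait until the frame without UI is rendered

        var shot = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        shot.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        shot.Apply();
        System.IO.File.WriteAllBytes(Application.persistentDataPath + "/snapshot.png",
                                     shot.EncodeToPNG());

        uiCanvas.enabled = true;                // show the UI again
    }
}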
Hi,
I run this code with the GY-85 BMP085 sensor. The ADXL345 library is shown in black instead of orange in the program. Also, in the serial monitor the values do not change; they are always "0.00, 0.00, 92.50". I don't understand why. I need help 🙁
Did you sort out the problem? Maybe a wiring problem?
I actually know what your problem is. In my code, ADXL345.h is also black, but it runs just fine. I get the same symptom if I move the libraries to the wrong location.
So to explain: when my code works correctly, I have my main folder labeled "Arduino". Within that, I have a unique folder for each ".ino" file, labeled the same way as the ".ino" file (minus the .ino), and a folder labeled "libraries". All of the libraries go in the "libraries" folder, each saved in its own folder titled with the name of that library followed by "_library". For example, it goes:
Arduino>libraries>ADXL345_library>"all contents of that library"
I get the problem where my serial monitor values are always "0.00, 0.00, 92.48" if I move the library contents out of the "ADXL345_library" folder and directly into the "libraries" folder.
I don't know if that actually makes a difference, but it was the same problem for me, so hopefully this helps you fix it!
Can you please tell me how to remove the axes while showing the output??
Do you need it anymore?
Your tutorials are great!
I learned so much from them. I watched almost all your AR tutorials and executed all the projects!
I had a lot of fun watching and learning.
Thanks a lot and keep up the good work.
Please post more tutorials 🙂
Best thanks – donation :))
Hi, No voice?
Hi, no, my throat hurts, it hurts in every single tutorial (rofl).
[…] http://www.himix.lt/augmented-reality/ (lots of AR videos here, not just Vuforia/Unity) […]
[…] http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ (first of several videos on Vuforia and Unity 3D) […]
good!
Hello Edgar! Great tutorials!
Have you tried to play with physics?
I’m trying to roll a ball when tilting the marker, but have no success 🙁
Maybe you can suggest something?
Thanks!
A little bit, but I just suggest you surf the web for these things with Unity physics.
Can we have the code, and which sensor is used?
I can sell it to you.
It's top secret.
It's an awesome tutorial, but I was wondering if it is possible to append additional text to the user's, for example: he/she wrote "I like this game", and at the end a fixed #CompanyName?
Hello dear,
can I know how I can display the car on that paper? Is there any device that does that, please?
Thank you.
Please follow Augmented Reality Tutorial 14 and, instead of the character, import a car from the Asset Store. Devices: smartphones/tablets/PCs.
Can I know how to lock the rotation so it only rotates around the Z axis? Any tips for this?
I believe in you. I believe you will find a way by yourself.
I do not know coding very well, so I really don't know where to change the code. Can you tell me?
Does the video have a backsound when we scan it?
What do you mean by backsound?
Thanks! I was searching for a tutorial like this!
It's only been half a year, but a lot has changed in Unity and Vuforia. I can't follow your video anymore. So sad.
Actually, nothing has changed except for newer versions of Unity and Vuforia. The steps to achieve the AR example stay the same.
A video with a new marker cannot be played. The video with the provided marker can be played. Do I need to upgrade to Unity Pro to play the video? Thanks in advance. 🙂
No need for a Unity3D Pro license, really; it's possible to make it work with your own target, just some attention is needed. Did you follow my steps?
Can the marker be anything of my own choice ? Like any image I want ?
Yes, if you start following from Augmented Reality Tutorial No. 14.
But that's using Unity3D and Vuforia. So does nyar4psg have this limitation of a predefined marker file?
With nyar4psg you will be able to track only square black markers; of course, you can make your own, but nothing like images. Such marker-based tracking won't be as robust.
I did everything step by step but my videos won't play.
The video shows up, but it gives me the X image, and when I click on it, it gives me the loading image forever.
How do i get my video to actuall play?
actually play,
sorry
ok
Do you have any idea whats wrong?
is there a way to do a multi-3D-object tracking?
I would say no, it’s too hard to track even one 3D object.
Where did you get the iron man skin?
Googled it out.
Hello, can this project be deployed as a desktop version (Windows or Mac)? If not, do you have any idea how this could be made possible? Thanks.
I haven’t deployed it anywhere, just tested it out in Unity Play Mode. I’m not sure whether some workaround would work on this project.
Hey, I tried doing the same… but Unity crashes when I try to add the ImageTarget. I am using Unity 5.3.3f Personal. Can you tell me the version of Unity you are using, so I can follow your videos?
Unity 5.2.4f 32 bit, Vuforia newest version.
Thanks for your tutorials,
but I want to ask you…
is there any way, or are there tutorials, for creating 3D objects or converting 2D images into 3D?
Of course there are, but I'm not making them. Example: http://www.123dapp.com/create
Hi
I'm studying an IT major. Could I see your code please? This video:
https://www.youtube.com/watch?v=DXLyBQTS5-w
Help me please
Hi,
What do you mean “could i see your code please”? It’s not my video.
i want to know this, could you teach me please!!
No.
Why do my video and image appear inverted? How can I fix that?
Struggling with this also; changing the orientation with a negative scale does not work.
The video in the preview is correct, but when playing it is inverted D:
Aha, fixed it: select the "Video" prefab and change Transform: Scale from +(number) to -(number); it inverts the image.
Example:
Scale X 0.1 Y 0.1 Z -0.1 <—
Doesn't work for me. I tried -0.1 and 0.1 but nothing changed.
Try inverting the X axis: X 0.1 Y -0.1 Z 0.1 (it worked for me).
Sorry, it is
X -0.1 Y 0.1 Z 0.1
My bad, I forgot to actually export it as an Android project.
Now it works.
For anyone reading the comments:
THE VIDEO WON'T PLAY IN THE EDITOR. YOU HAVE TO EXPORT IT OR IT WON'T WORK.
.fbx HELP
My video isn't that long; how can I change the script so it loops?
I’ll leave it to you to solve it.
I just want to say you are a beast, really appreciate the effort you put in.
Hi, is the MPU6050 the same as the ITG3200? When I search for them separately, I see the same pictures.
The MPU is like 2 sensors in one (accelerometer and gyroscope), so I would say they're not the same.
I quite like reading through a post that can make men and women think.
Also, thanks for allowing for me to comment!
Sir, can you change the tracker to a user-defined one?
Hi Admin,
I tried Processing on Ubuntu but I don't know how to import the Nyar4PSG library. I tried to create the folder and copy it to ~/Documents/Processing/libraries, but it doesn't work correctly.
I'm not sure about the Ubuntu… Processing version 2.2.1?
Yes, I tried Processing versions 2.2.1 and 3.0.2.
Use 2.2.1. What errors do you receive?
Thank you for the tutorial! I am working on an app that is going to use Text Reco and Cloud Reco. I have a couple of questions I am hoping you could answer. For starters, when I run it in Unity, the area that can actually read the text is really small and not very forgiving whenever I move the text. I was wondering if you know of a way to make the region where it reads the text larger / more forgiving when the target or phone moves? Also, I was wondering if you know anything about cloud recognition; I tried using the Vuforia tutorials, but they are out of date and no longer work, and I can't seem to figure out the newest tutorial either. I'm assuming I'm messing up in some way, because when I look online nobody else seems to struggle. Any input would help, especially with cloud recognition. Thanks!
Don’t know about the text stuff, but I will make cloud recognition tutorial pretty soon.
Awesome! How soon should I be looking for the tutorial?
Can you please post the code for the MPU6050? I'm stuck with my project and need some help.
Can we use a webcam instead of a Kinect?
No.
Wow…Very Nice tutorial..Thanks
Yeah, sure, your welcome and my paypal account is… :))))
Hi,
PLEASE HELP ME…
I am facing issues creating virtual buttons. I've followed this video. I am including a video below to illustrate my issues.
Link : https://www.youtube.com/watch?v=mxD0PiQ28_o
Note:
1. The virtual buttons do not work unless I focus on the button.
2. Without touching a button, the model changes based on my camera movements.
3. I changed max simultaneous tracked images from 1 to 4 (each in a separate build on my Android phone).
4. I also changed the virtual button sensitivity setting from HIGH to LOW (each in a separate build on my Android phone).
If you want, I will also send a link to my Unity package file.
Thanks
ESWAR
Email: eswarkumar.borra@gmail.com
Hi. Try to put the virtual buttons on a textured part of the image target, not on white spaces.
still not working….:(
Hi,
What steps need to be performed if we want video playback on a cylinder target? I want to see a video on a cylinder-like object instead of a flat image marker.
Thank you,
Nik.
Combine the current project and this one: http://www.ourtechart.com/augmented-reality/augmented-reality-video-playback/
I have already managed to display a video on an image target.
In the image target case, we upload our marker image to the developer portal database, but in this case assume that the image marker is a sticker attached to a bottle. I want to see the video as I scan the sticker.
So, should I upload that image target as a cylinder target image in the developer portal database?
And what would the hierarchy inside the Unity project be?
In the case of video playback on a target image:
– ImageTargetStones (parent) contains ImageTargetBehaviour.cs
– Video (child of ImageTargetStones) contains VideoPlaybackBehaviour.cs
What would the hierarchy be for displaying video on a cylinder?
Works well. Thanks so much
Hi !
I did everything like you with Unity 32bit but when I click on start and show the target in front of my webcam, the 3D model doesn’t appear in AR..
Can you help me ?
Thank you !
I would guess that you didn't check all the needed checkboxes in the ARCamera or didn't select the tracker image in the ImageTarget.
Thank you !!! 🙂
I did everything as per tutorial. However, after pressing play my pc shows a black screen instead of the webcam display
Unity 32 bit?
No 64 bit. Will try 32 bit thanks.
hello,
thank you for this tutorial.
I tried to follow this video, but I have a problem.
Assets/script/SnapshotShare.cs(7,17): error CS0246: The type or namespace name `AndroidUltimatePluginController' could not be found. Are you missing a using directive or an assembly reference?
This error appears; do you know why?
Have you installed plugin?
Of course! I can't find the class AndroidUltimatePluginController.
Should I change the class name? Where?
Did you buy it?
Hi… I just bought the plugin, but somehow I face the same problem as sh. (I am new to this.)
The same problem here.
Hey.
Nice tutorials, they really helped.
One doubt though: what is the basic difference between markerless and marker-based AR? I tried searching but I'm still confused. In this case, if we are adding the image beforehand, then how is this markerless AR?
It would really help if you could clear up my doubt.
In marker-based tracking we track only black square markers. In markerless solutions we can track image targets, faces, hands, fingers, finger-like objects, bodies, etc.
Hi, does ARToolKit support building an exe in Unity 3D?
I haven't used ARToolKit.
Great… I needed this tutorial, thanks a lot.
Can you make another tutorial on rotating the car with left and right buttons?
I already made the UDT but I can't rotate the object.
Thanks :)))
it’s done here: https://www.ourtechart.com/augmented-reality/augmented-reality-user-interface-unity3d/
Thank you…
I tried to make it, but it keeps looping when I press the button.
Is there any fix for this?
Uncheck the Loop box in the AudioSource inspector.
thank you, you’re awesome :))))
Hey, I downloaded your APK for testing, and when I click on the video, it loads forever. Any idea what I'm doing wrong?
Please rebuild the apk file using the provided project files. It should work.
Hello sir, this tutorial is very nice…
but I want to ask about the C# code in appcontent…
Have you updated the C# code in appcontent?
I didn’t need to. Why do you ask?
Hi, I still don't know how to install the distributed nyar4psg library into the program. I have googled a lot but couldn't find anything. Could you show me how? Thanks.
When I click the play button, my webcam doesn't open. What should I do?
What error do you receive?
How do I keep the video playing continuously when I move the camera focus away from the target? Thanks in advance.
Hi Kiran,
did you find out how to play the video continuously when the target moves out of the camera's focus? Ideally we would like to use Vuforia only to trigger the video player, so it comes out of the image, turning and moving towards the screen, finally settling into place. Once it's in place, we can touch the play button for the video to play in full-screen mode. It would also be nice to close the finished video and return to targeting mode to trigger another video from a different image. Any help would be greatly appreciated. TIA.
Only the video preview in full-screen mode does not depend on the tracking state.
As for the other needs: there is no easy recipe for it, you just need to code, but I don't think you'll be able to add extra buttons (from your side) while the video is in full screen.
Thank you, will try it.
Why can't I find the AppManager??
Hi Edgaras,
thanks a lot for the tutorial. I'm having the same webcam problem, where it seems you need the 32-bit version, but the latest Unity version has no 32-bit build. What can I do?
Get it from here: https://unity3d.com/get-unity/download/archive
Hi, I followed your tutorial and it worked perfectly! Thank you so much for sharing this kind of knowledge, I really appreciate it.
What I would like to ask is: how can I make the resulting .apk as light as possible? Could you give me some tips on that?
I assume you only use videos in the app, so you could stream them from the cloud.
How can I do that? (Streaming from the cloud.)
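One hedged option, if you are on Unity 5.6 or newer: UnityEngine.Video.VideoPlayer can play straight from a URL instead of a clip bundled into the apk. A rough sketch (the URL is just a placeholder for wherever you host the file):

using UnityEngine;
using UnityEngine.Video;

public class CloudVideo : MonoBehaviour
{
    public string url = "https://example.com/myvideo.mp4";    // hypothetical hosted video

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;                       // stream instead of a local VideoClip
        player.url = url;
        player.renderMode = VideoRenderMode.MaterialOverride;  // draw onto this object's material
        player.Play();
    }
}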
Hi … wonderful tutorial. However, I followed all of the steps you explained but the camera can not detect the 3D object even on the textured sheet. Can you please help me with that?
Hey, I have the same issue. I followed the guide and ran the app on a Nexus 7, but the object cannot be detected. I did not make any modifications to the code, so I don't get why it is not working.
Hello, I need a presentation like this: https://www.youtube.com/watch?v=Kb99oD7FVgA. What would it cost?
What exactly do you need?
We need that same application with the same animals
Hi, I followed your tutorial, and I couldn't find AppManager.cs and SceneViewManager.cs in the Unity /Assets folder.
Could you tell me how and where I can find them?
Nice tutorial, but…
Can I request an AR FPS tutorial??
What’s FPS?
I know this is late, but I think he means something like in the show, where the monsters are huge and standing right in front of the player.
J espère que ce projet sera possible sur smartphone et vous créez tout les monstre de yu gi oh car sur l application ( androdisc ) il a été seulement 60 monstre et j espère de savoir comment je peux construire ce demo et merci
Can you write it in English? Thanks.
Hey, please can you tell me how I can use my own video instead of the video in the appcontent folder? Or how do you convert the video to its meta files?
The same way I did it in the video; you don't need to convert anything to meta files.
So basically you mean I just have to put my video in the appcontent folder instead of your augmented_reality_technology?
By doing this, will my video play instead of the video provided by you?
Pretty much; don't forget the *.m4v file extension.
I tried it, but when I play it on mobile, the moment I click on the screen to play, the screen goes black… Is there a specific mistake that I am making? Can you please tell me?
I have tried your tutorial, but when I move the object away from the camera, the interface stays on the screen, just at an angle. How do you fix this? Is it something to do with the script? Please let me know, thanks.
Same here! I would love to know how to fix it, so it only pops up when you point towards the target. I already tried putting the Canvas inside the Image Target, and it does not work. Thanks for the tutorials!
Thank you for your tutorial, it's very helpful.
I followed your tutorial, but I have an error like this:
Error attempting to SetHeadsetPresent
UnityEngine.Debug:LogError(Object)
Vuforia.VuforiaAbstractBehaviour:SetHeadsetPresent(String)
Vuforia.VuforiaAbstractBehaviour:Start()
How do I fix it?
Hello, I downloaded the source code and tried running it on my Android device. However, only the 2D ground image is displayed when targeting the image target. Any clue why that might be happening? Please help as soon as possible. Thanks.
Hi Edgaras,
first of all, thanks for the tutorial series. I have a question: in this case, although we move the tracking image with our hands, it seems to remain stationary in the Unity scene, and it is the AR Camera that appears to move.
What I want is that, as I move the tracking image with my hands, the 3D object placed on the image moves along with it in 3D space.
Please help me figure it out.
I really don't get what you want to do, as it already does what you just described.
Can I have the code and the name of the sensor too?
I have an MPU-9255 sensor.
Where can I get this application?
Hey man, I've tried this tutorial and it works, but now I've got a problem: the warning says "trackable userdefine lost" and the object doesn't show up when I click the button.
Can you tell me how to fix this?
Thanks
Hello Edgaras,
thank you very much for your tutorials.
I tried this with my own video and it works perfectly.
I also changed the orientation of the video by selecting VIDEO in the Hierarchy and changing the X scale value from 0.1 to -0.1.
I have a problem when I pause the video and play it again: the music starts from the beginning but the video stays frozen.
Where is the problem? Maybe because I stream an MP4 video instead of M4V?
Thank you very much
Marco
I am getting this error:
The type or namespace name `MovieTexture' could not be found. Are you missing a using directive or an assembly reference?
Same here. I'm basically new, so I don't understand how the script works. When I try playing it works, but that error is there and I can't build it.
cool
Are you using a standalone app for Windows? If so, can you tell me how you did it?
Thanks.
No, I'm not.
How do I export the application so it can be displayed on a big screen?
Hey man, I left a comment on the old website too.
I want to know if this is going to work on an Uno?
Uno?
I meant Arduino Uno. I can see that you used Arduino Nano.
It will work the same way.
Awesome! How can I put a request in that code?
You can make a lot of magic for games. We can study on your website, hehe.
Study, my friend, study… And when one of your apps makes millions, don't forget to drop a few bucks my way 😉
I am Chinese, in China; I use a VPN to browse your site O(∩_∩)O Thanks
No problem.
Hey man, should I use the old website for requests?
The next step should be firing bullets.
Haha, kill zombies.
cool
Nice 🙂
Hello, how do I compile the application so that it can be seen on a big screen?
I wonder how to move this kind of app to a mobile phone and create a control system on it? Thank you!
In a similar way I did it here with buttons:
https://www.ourtechart.com/augmented-reality/tutorial/augmented-reality-user-interface-unity3d/
and here with exporting the app:
https://www.ourtechart.com/augmented-reality/tutorial/augmented-reality-android-app-export/
How about jump control? This is just for movement control. I'm so confused.
Sign me up! We are in awe with this AMAZING project! We are ready to get our hands & minds on it!
First day of Summer Break, notification popped up for newest demo…my boys dropped EVERYTHING to hover & watch!
Love it!!
What is this for?
Please, tutorial part 3!!! You are awesome, bro!!
Does this work with android?
Yes.
Hello,
I am a fan of your page.
In Tutorial No. 39 you put some jpg images as an example.
How do I place OTHER jpg images in Unity?
I tried putting some in, but it did not accept them.
Grateful
Great tutorial, but how can I test it on an iPhone?
Hi, I think you could just build the .apk file in Unity, copy it to your device, and install it manually. It should work.
cool i like
When I import the videoplayback package, I get these errors:
Assets/Common/MenuOptions.cs(10,19): error CS0234: The type or namespace name `UI' does not exist in the namespace `UnityEngine'. Are you missing an assembly reference?
Assets/Common/SplashAbout/AsyncSceneLoader.cs(7,19): error CS0234: The type or namespace name `UI' does not exist in the namespace `UnityEngine'. Are you missing an assembly reference?
Assets/Common/SplashAbout/LoadingScreen.cs(10,19): error CS0234: The type or namespace name `UI' does not exist in the namespace `UnityEngine'. Are you missing an assembly reference?
How do I solve this?
Is the score between the two avatars real or fictional? If it is real, how is it done?
Real.
https://www.ourtechart.com/augmented-reality/tutorial/augmented-reality-eye-tracking/ I want to learn this tutorial,thanks
Can I use a simple webcam, or a Kinect camera??
Very good job !
WYSIWYG: what you see in this tutorial is what you get in the download.
Thanks
I’m just building the unity 3D project on my Samsung and the panel and buttons are not appearing. They appear when I test it using UNITY 3D but not on my phone.
Do you have any idea of what could it be?
You are great, your content is worthy of a master's-level class. This game is really good and has a lot of potential in many ways, but I think you are missing digital marketing. If you need help with that, I know a little, hehehe. I hope you keep making this kind of content and that your projects are a success.
Greetings from Colombia.
How do I make a model?? Thanks.
Could you show me how to make it? It's so great.
How can I do that for a static 3D character, with no animation??
If the 3D model is rigged, then you can do it with a static model (without animation). The head moves together with the head bone. For the whole model (the parent GameObject), for left/right rotation just use one of the rotation methods (RotateAround, eulerAngles). 🙂
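A tiny sketch of the parent-rotation part (RotateAround about the Y axis; the key bindings and speed are arbitrary choices):

using UnityEngine;

public class RotateModel : MonoBehaviour
{
    public float degreesPerSecond = 90f;

    void Update()
    {
        float input = Input.GetAxis("Horizontal");   // left/right arrow keys or A/D
        // Spin the whole model around its own vertical axis.
        transform.RotateAround(transform.position, Vector3.up,
                               input * degreesPerSecond * Time.deltaTime);
    }
}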
Please, I'm a student starting my graduation project and I need help. I want to know how to start an augmented reality app for Android devices; I want to use Android Studio, step by step. My idea involves face tracking too.
Can you please help me?
I have the same problem as:
"I did everything step by step but my videos won't play. It shows up but it gives me the X image, and when I click on it, it gives me the loading image forever." Also, I can't find the file "AppManager.cs". Any idea? I use the latest Unity and Vuforia plugins.
I noticed there were questions about the videos being inverted in tests. Mine is doing the same, and I have tried all the suggestions. Can anyone help with where the proper axis change should be made?
Current setting for ImageTarget is: X -0.1 Y 0.1 Z 0.1
Thanks in advance.
tutorial please.
good idea! learn from U! thx!
Hi, thanks for your tutorials. These tutorials are a great help for beginners.
I'm facing a small problem, please guide me through it: when I press the arrow keys, the player animates perfectly and rotates too, but doesn't physically move on the plane; it animates only at a fixed point… Thanks in advance 🙂
Your website is awesome. I discovered it like several months ago, but always thought that this requirement of having a target image is somewhat cumbersome. Thank you very much, sir!
I've tried to make video playback like this, but Unity says that "ISampleAppUIEventHandler" cannot be found. That's because I don't have that file in my project, so where can I get it???
hi
In your opinion, which one is better for a good quality in AR project?
KUDAN or VUFORIA.
Both 🙂
Can Vuforia achieve it?
No, Vuforia doesn't have SLAM at the moment.
Thank you very much for your sharing!
Hello, I'm from Manaus, Brazil.
First, congratulations on your tutorials, they deserve top marks. I've been following this one, but in my Unity (both 32- and 64-bit) it generates an error when starting the camera. I've already put the API key in the editor and it still generates an error with this name:
DllNotFoundException : KudanPlugin
Kudan.AR.TrackerWindows.StopInput ( ) (at Assets / KudanAR / Scripts / Classes / TrackerWindows.cs : 96)
Kudan.AR.KudanTracker.OnDestroy ( ) (at Assets / KudanAR / Scripts / Components / KudanTracker.cs : 438 )
If possible, send some help by email; I would be very grateful.
Hugs
switch to android or ios.
Excuse me, I don't know why my bullet flies erratically when I change the AR camera (using the card to make the virtual thing appear) and the image target.
Excuse me, can I ask where "virtual button event handler" is? It's not in the Vuforia scripts.
Wow, so cool.
I only know the shooting scenario in Unity, but I really want to know how to make this game.
I followed your tutorial, it's excellent. But I am not able to control the animation; in Game view it's very large. Can you please explain how to control the animation?
Keyboard keys left/right/forward/backward.
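If it helps, a minimal sketch of that kind of keyboard control (not the tutorial's exact script; the Animator parameter name "Speed" is just an assumption):

using UnityEngine;

public class KeyboardCharacterControl : MonoBehaviour
{
    public float moveSpeed = 2f;
    public float turnSpeed = 120f;
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        float forward = Input.GetAxis("Vertical");    // up/down arrow keys
        float turn = Input.GetAxis("Horizontal");     // left/right arrow keys

        transform.Rotate(0f, turn * turnSpeed * Time.deltaTime, 0f);
        transform.Translate(Vector3.forward * forward * moveSpeed * Time.deltaTime);

        if (animator != null)
            animator.SetFloat("Speed", Mathf.Abs(forward));  // drive the walk animation
    }
}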
Where can we get the “Change Motion” Button? That’s a whole different story isn’t it?
Sir, I have an issue, please help me: my target image is not showing in Unity3D… it shows up white in Unity.
Hello,
Thanks a lot for such useful and detail instructions! I’m just starting exploring how to create AR with Vuforia and Unity. And these tutorials definitely come in handy 🙂
I tried to follow this tutorial, but unfortunately there is no such property for a button (like in your video at 7:11). Here's a screenshot of what I see: http://prntscr.com/c5cqi4 . There is no init() function. I tried to use start() instead, but it didn't generate that 2nd script where you change some code (from a private function to public).
I’m using Unity 5.4.0 and Vuforia 6 (tested on v5 as well).
Can you please explain me what I’m doing wrong and how to fix it? Thank you so much in advance! Hope you’ll find time to answer.
Keep up doing awesome things! 😉
Best regards,
Alex
Hello, I like this video. How do I make it, and how can I study it? I come from China.
How do I run it on 64-bit Unity?
Vuforia works only on 32-bit Unity.
thanks!
This video is so amazing!
I want to know how to do that. Can you send me source code or tutorial of this project.
Thank you!
https://drive.google.com/open?id=0BygvzTqnzm_wblduNnVSdmllZ00
I received it. Thank you so much!
Hello! Can you please explain how I can put this code into my 3D objects to make them transform?
Hello guys,
great work.
Excuse me, when will you actually post the tutorial? It interests me too, and I want to learn more about AR.
Regards,
Fredy
It really helped me! you have such a great material thanks!
can you help me
Failed to load ‘Assets/KudanAR/Plugins/x86_64/KudanPlugin.dll’ with error ‘操作成功完成。
‘, GetDllDirectory returned ”. If GetDllDirectory returned non empty path, check that you’re using SetDirectoryDll correctly.
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:203)
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:196)
Kudan.AR.KudanTracker:Start() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:220)
Unity 64-bit? Did you switch to the Android platform?
Yes, I switched to Android.
Did anyone manage to resolve the above problem?
Why no code? Can you share it with us?
Hi, may I know how the characters look at each other? Are you using LookAt in Unity, or something else? I want my characters to look at each other, but I still haven't found how.
Correct, LookAt.
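A minimal sketch of the LookAt idea (one script per character, with the other character dragged into the field in the Inspector):

using UnityEngine;

public class FaceEachOther : MonoBehaviour
{
    public Transform other;   // the opposing character

    void Update()
    {
        // Keep the target at the same height so the character only turns around Y
        // and doesn't tilt up or down.
        Vector3 target = new Vector3(other.position.x, transform.position.y, other.position.z);
        transform.LookAt(target);
    }
}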
Hi There, I am playing around with this and am wanting to have 5 pages instead of 3. For some reason when I add two more pages, the swipeimage script seems to malfunction, not allowing me to swipe at all. Any thoughts? I adjusted all the parameters I could think of to account for the new pages but I didn’t mess with the script at all. Would it need modification? It didn’t seem like it should…
hey, thanks for the tutorial…but the share button does nothing and all the other buttons work. I bought the plugin and followed the tutorial, is there a permission I should be adding or something has changed?
I have followed the instructions above. However, the plane does not automatically disappear unless I click it. After I click to make the plane disappear, the cube or sphere does not appear. Please give some advice.
I am using Unity 5.2 and Vuforia SDK 5.5.9.
Could you tell me the computational cost of this application, considering it running on an Android smartphone?
Hi Edgaras, I'm very interested in your AR technique from the video. If possible, could you make a tutorial or share some information I can look up about this?
Also, in Unity, could you tell me how you map the 2D coloring texture onto the 3D model? Please!
Thank you!
Hello, I need some help with virtual buttons.
Can you demonstrate how to make virtual buttons rotate and move objects?
Hi!!
Does this plugin work with iOS??
No. I already tried. It doesn’t work on iOS.
you are awesome, nice tutorial.
When I scan a plane using the camera and the loading bar, the model gets loaded in another plane. Is there anything I might have missed or messed up? I followed your tutorial closely!
Can you help me with making a project?
Cloud Recognition, even if it means I pay you for it.
Hi Edgaras,
I follow your tutorials and they are great.
I have a problem.
I am using unity and vuforia (user defined target).
I am recognizing objects as targets (I followed this tutorial), but my virtual 3D object and canvas are unstable, and the extended tracking doesn't work like it does with image targets.
Do you have some experience with this? Has this happened to you at some point?
Let me explain: I have a sculpture to recognize, and I tried the AR-Media object scanning solution, but the app becomes slow and unstable, which is why I am using user defined targets to overcome my problems with object recognition.
No audio in android player? Do I need to do something specific?
There should be audio, nothing additionally is necessary.
Why does my AppManager.cs get an error?
#region PUBLIC_MEMBER_VARIABLES
public string TitleForAboutPage = "About";
public ISampleAppUIEventHandler m_UIEventHandler; // error: The type or namespace name 'ISampleAppUIEventHandler' could not be found
#endregion PUBLIC_MEMBER_VARIABLES

#region PROTECTED_MEMBER_VARIABLES
public static ViewType mActiveViewType;
public enum ViewType { SPLASHVIEW, ABOUTVIEW, UIVIEW, ARCAMERAVIEW };
#endregion PROTECTED_MEMBER_VARIABLES

#region PRIVATE_MEMBER_VARIABLES
private SplashScreenView mSplashView;
private AboutScreenView mAboutView;
private float mSecondsVisible = 4.0f;
#endregion PRIVATE_MEMBER_VARIABLES

// This gets called from SceneManager's Start()
public virtual void InitManager()
{
    mSplashView = new SplashScreenView();
    mAboutView = new AboutScreenView();
    mAboutView.SetTitle(TitleForAboutPage);
    mAboutView.OnStartButtonTapped += OnAboutStartButtonTapped;
    m_UIEventHandler.CloseView += OnTappedOnCloseButton;
    m_UIEventHandler.GoToAboutPage += OnTappedOnGoToAboutPage;
    InputController.SingleTapped += OnSingleTapped;
    InputController.DoubleTapped += OnDoubleTapped;
    InputController.BackButtonTapped += OnBackButtonTapped;
    mSplashView.LoadView();
    StartCoroutine(LoadAboutPageForFirstTime());
    mActiveViewType = ViewType.SPLASHVIEW;
}

public virtual void DeInitManager()
{
    // mSplashView.UnLoadView();
    // mAboutView.UnLoadView();
    // m_UIEventHandler.CloseView -= OnAboutStartButtonTapped;
    // m_UIEventHandler.GoToAboutPage -= OnTappedOnGoToAboutPage;
    InputController.SingleTapped -= OnSingleTapped;
    InputController.DoubleTapped -= OnDoubleTapped;
    InputController.BackButtonTapped -= OnBackButtonTapped;
    m_UIEventHandler.UnBind();
}

public virtual void UpdateManager()
{
    // Does nothing, but anyone extending AppManager can run their update calls here
}

public virtual void Draw()
{
    m_UIEventHandler.UpdateView(false);
    switch (mActiveViewType)
    {
        case ViewType.SPLASHVIEW:
            // mSplashView.UpdateUI(true);
            break;
        case ViewType.ABOUTVIEW:
            mAboutView.UpdateUI(true);
            break;
        case ViewType.UIVIEW:
            m_UIEventHandler.UpdateView(true);
            break;
        case ViewType.ARCAMERAVIEW:
            break;
    }
}

#region UNITY_MONOBEHAVIOUR_METHODS
#endregion UNITY_MONOBEHAVIOUR_METHODS

#region PRIVATE_METHODS
private void OnSingleTapped()
{
    if (mActiveViewType == ViewType.ARCAMERAVIEW)
    {
        // trigger focus once
        m_UIEventHandler.TriggerAutoFocus();
    }
}

private void OnDoubleTapped()
{
    if (mActiveViewType == ViewType.ARCAMERAVIEW)
    {
        mActiveViewType = ViewType.UIVIEW;
    }
}

private void OnTappedOnGoToAboutPage()
{
    mActiveViewType = ViewType.ABOUTVIEW;
}

private void OnBackButtonTapped()
{
    if (mActiveViewType == ViewType.ABOUTVIEW)
    {
        Application.Quit();
    }
    else if (mActiveViewType == ViewType.UIVIEW) // hide UIMenu and show ARCameraView
    {
        mActiveViewType = ViewType.ARCAMERAVIEW;
    }
    else if (mActiveViewType == ViewType.ARCAMERAVIEW) // if it's in ARCameraView
    {
        mActiveViewType = ViewType.ABOUTVIEW;
    }
}

private void OnTappedOnCloseButton()
{
    mActiveViewType = ViewType.ARCAMERAVIEW;
}

private void OnAboutStartButtonTapped()
{
    mActiveViewType = ViewType.ARCAMERAVIEW;
}

private IEnumerator LoadAboutPageForFirstTime()
{
    yield return new WaitForSeconds(mSecondsVisible);
    mSplashView.UnLoadView();
    mAboutView.LoadView();
    mActiveViewType = ViewType.ABOUTVIEW;
    m_UIEventHandler.Bind();
    yield return null;
}
#endregion PRIVATE_METHODS
someone help me
Very nice. Where can I find / download the target image?
I am running NyAR4psg/3.0.5;NyARToolkit/5.0.9 in processing 2.2.1 with a Microsoft LifeCam HD-5000 on windows 7. When I run simpleLite, the background (camera) image appears only in the upper right corner of the window. It shows the lower left of the camera view. If the background image was correct the tracking appears to be correct. I looked in the reference material and found public void drawBackground (processing.core.PImage i_img)
This function draws the PImage to the background. PImage draws in part of farclip surface +1.
This function is equivalent to the following code.
PMatrix3D om = new PMatrix3D(((PGraphicsOpenGL) g).projection);
setBackgroundOrtho(img.width, img.height);
pushMatrix();
resetMatrix();
translate(0, 0, -(far * 0.99f));
image(img, -width / 2, -height / 2);
popMatrix();
setPerspective(om);
My approach was to substitute this code in for the line "nya.drawBackground(cam);" and then adjust the translate to correct the issue. But I get a "syntax error, maybe a missing semicolon?" error. I added a semicolon to the end of the second line, setBackgroundOrtho(img.width, img.height);, and it still hangs on the first line with the same error.
Any help would be appreciated.
Can you use the kinect’s rgb as the input video feed for marker based AR?
Hi,
Please help me.
I downloaded the Augmented Reality Vespa User Interface – Mimic No. 1. Really, this is only the interface, so I can't test the project.
Where can I download the motor image?
The page updated with a tracker image.
Hi, I've already sent an inquiry for the demo; hope to hear from you soon.
thanks
Hello, why haven't there been any video updates recently?
Hi, I need to know if I have to buy a 3D sensor camera to build a game with Smart Terrain, or can I use the traditional camera of my smartphone? Thank you.
You can use the traditional camera.
Hello, could the interactive tutorial series be offered as a video presentation? With the paid version I can only see some parts, so I don't know much about the project.
Hello Team,
Thank you for providing this nice platform, We are looking for a really good developer who can develop this paint functionality for us, We are already working on our product and need to integrate that part in it ( we are using Unity3D, Vuforia, C#).
The basic requirement, app should recognize/read the colors from the marker and apply it on the model itself.
Looking forward to hear from you soon.
Regards
ABID
P.S. I’ll be submitting few cool AR demos to this site, very soon 🙂
I just want to say you are great. I have no words to thank you. You are amazing. You rock. You are the best.
Hi…
good job
Please make an educational tutorial on ARToolKit.
Hi, thanks for the tutorial!
I'm having an issue with the screenshot aspect ratio. When I take a screenshot (in landscape or portrait mode), the image comes out vertically stretched (or horizontally squished). I tested it on 3 Android devices, and it's the same on all 3. The images come out normal when I take a screenshot in Unity on my Mac.
After a lot of research, I still can’t figure out the cause.
Do you have any suggestions?
Thank you
Hi hello good day, 🙂
Thank you so much for the tutorial! Really appreciated it. 🙂
But anyway, do you have any idea how to reset the distance value once OnTrackingLost fires?
Because every time I need to separate the targets first (while scanning the object), then the particle effects get destroyed.
Otherwise, the particles will still remain on top of the image target when I scan for the second round, even though I didn't connect the paper.
I would greatly appreciate it if anyone could help with this problem. Thank you! :)
When I scan only one part of the image alone for the second round, the particles still stick to the image even though I didn't pair it up with the other image target.
I've tried a few solutions, but too many errors come out.
In one of them, I tried putting parts of the following inside the OnTrackingLost() section:
”
string NameTarget = "imageTarget_" + mTrackableBehaviour.TrackableName;
GameObject target = GameObject.Find(NameTarget);
transform.position = new Vector3(0, 0, 0);
”
in order to reposition the sphere back to its normal position when tracking is lost, but it doesn't seem to work, because I'm not very good at C# coding.
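For reference, a minimal sketch of one way to reset a displaced object when tracking is lost; it is not the project's code, and all names besides the idea of calling it from OnTrackingLost() are hypothetical:

    using UnityEngine;

    public class ResetOnTrackingLost : MonoBehaviour
    {
        public Transform movedObject;      // hypothetical: the sphere/particle object that gets displaced
        private Vector3 startLocalPosition;

        void Start()
        {
            // Remember where the object sits relative to its image target at startup.
            startLocalPosition = movedObject.localPosition;
        }

        // Call this from OnTrackingLost() in your trackable event handler.
        public void ResetPosition()
        {
            movedObject.localPosition = startLocalPosition;
        }
    }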
Hi
Can I use this solution in Android apps?
Can MS Kinect drivers work on Android with the phone camera?
Did you test it?
Thanks for the answer a lot.
Hey guys, I have used your tutorial to make the simple video playback app, and it's working great. I just want to know how we can change the size of the video that appears after tracking the image target?
Please i need help
Where can i buy full version?
can you help me to create an app with AR
How can I increase the rate of scaling for video playback in Unity? I want my scaling to be doubled; how can I do this?
Did you use Particle WIFI Module with antenna??
It’s without antenna.
Hi,
Thanks for the tutorial, you make really easy tutorials.
One question: with the current script, none of the UI elements are visible in the screenshot.
Is it possible to place a logo/signature in the top left of the screen so that it also gets displayed in the screenshot?
I tried but was unable to succeed.
Why does the Kudan package not work on 32-bit Unity?
I don't know; probably limited support, lack of human resources…
Could I get breadboard circuit diagram??
Hello, I have followed the whole tutorial properly, but when I connect my laptop to the Kinect, the picture won't open. I don't know what happened; do you know how to solve this problem?
Hello! Great tutorial, but when I copy SnapShot.cs I get:
CS0246 C# The type or namespace name "AndroidUltimatePluginController" could not be found (are you missing a using directive or an assembly reference?)
What to do now?
Is it possible to make an augmented reality based Android application in Processing 3? If it is possible, then how do I do it?
I suggest you drop Processing 3 and use Unity3D. Don't waste time on Processing.
And one more , Can we use our own marker ?
can you please let me know if you have any apps for sale
Hi, I'm interested in how you do that. Could you make a tutorial about how to do it?
How do I export the AR app from Unity? I always get the error message: Unable to locate Android SDK.
It's because you need to install the SDKs from the Android website.
Downloading Android Studio will help; then you need to point Unity to the SDK path (Edit > Preferences > External Tools).
That's it.
I tried to run your project in Unity 3D and it's amazing! Thank you. But when I built it into an application, it can't detect my webcam. Do you know why? Please give me an answer, thank you.
When playing this sketch, at the line context.enableUser(); I get this error: "The function context.enableUser() expects parameters like context.enableUser(int)".
Please help me to resolve this.
Could you explain a little bit about the math for target tracking?
Many thx.
Could you explain about the math please ?
Great tutorial! how can I download the cube multi target ? I need to print the QC paper to create the cube to be able to try this tutorial!
Thanks.
I have never succeeded in creating AR files using Vuforia and Unity. I use a desktop PC that doesn't have any camera; can I do it with this specification: desktop PC, Win 10, 16 GB RAM?
Nice 🙂
Assets/VirtualButtonEventHandler.cs(5,14): error CS0101: The namespace `global::' already contains a definition for `VirtualButtonEventHandler'. What about this error?
[…] https://www.ourtechart.com/augmented-reality/tutorial/kudan-slam-technique-pokemon-go/ […]
And can you give a tutorial that requires writing C# code in order to control the virtual image?
I just think youre great.
hi sir
your tutorials are great. thanks for uploading…
Can we integrate 2 or more videos with a single image target and make next and previous buttons to switch between the videos?
is it possible ?
thanks in advance
Hi there,
Thank You for the Video
I am new using Unity and all these stuff
I followed each step
But I had an error after I removed the Utility folder, at minute 10:34.
This is the error:
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(19,13): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the legacy particle system, which is deprecated and will be removed in a future release. Use the ParticleSystem component instead.’
Assets/Vuforia/Scripts/Utilities/VRIntegrationHelper.cs(99,29): warning CS0618: `UnityEngine.Camera.SetStereoProjectionMatrices(UnityEngine.Matrix4x4, UnityEngine.Matrix4x4)’ is obsolete: `SetStereoProjectionMatrices is deprecated. Use SetStereoProjectionMatrix(StereoscopicEye eye) instead.’
Assets/Vuforia/Scripts/Utilities/VRIntegrationHelper.cs(100,30): warning CS0618: `UnityEngine.Camera.SetStereoProjectionMatrices(UnityEngine.Matrix4x4, UnityEngine.Matrix4x4)’ is obsolete: `SetStereoProjectionMatrices is deprecated. Use SetStereoProjectionMatrix(StereoscopicEye eye) instead.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(75,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(79,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(87,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/SampleScenes/_Data/Scripts/MenuController.cs(91,16): warning CS0618: `UnityEngine.GameObject.SetActiveRecursively(bool)’ is obsolete: `gameObject.SetActiveRecursively() is obsolete. Use GameObject.SetActive(), which is now inherited by children.’
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(49,32): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the legacy particle system, which is deprecated and will be removed in a future release. Use the ParticleSystem component instead.’
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(52,125): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the legacy particle system, which is deprecated and will be removed in a future release. Use the ParticleSystem component instead.’
Assets/ZigFu/Scripts/UserEngagers/ZigEngageSingleSession.cs(6,23): warning CS0649: Field `ZigEngageSingleSession.EngagedUser’ is never assigned to, and will always have its default value `null’
error CS1705: Assembly `ZDK, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null’ depends on `OpenNI.Net, Version=1.5.2.7, Culture=neutral, PublicKeyToken=6b43d0c6cf74ee7f’ which has a higher version number than referenced assembly `OpenNI.Net, Version=1.4.0.2, Culture=neutral, PublicKeyToken=6b43d0c6cf74ee7f’
C:\Users\Esra\Documents\Zigfu\Assets/ZigFu/Scripts/_Internal/ZDK.dll (Location of the symbol related to previous error)
Compilation failed: 1 error(s), 10 warnings
I found a solution for the canvas-disappearing problem:
In \Assets\Vuforia\Scripts\DefaultTrackableEventHandler.cs, comment out (add "//" in front of) the gameObject.SetActive lines that come right after these Debug.Log calls:
Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " found");
// gameObject.SetActive(true);
and
Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " lost");
// gameObject.SetActive(false);
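For context, a hedged sketch of how those commented-out lines could sit inside the two callbacks. This follows the commenter's own snippet, not necessarily the stock DefaultTrackableEventHandler (which, depending on the Vuforia version, toggles renderers and colliders instead):

    // Inside DefaultTrackableEventHandler.cs (exact bodies vary by Vuforia version)
    private void OnTrackingFound()
    {
        Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " found");
        // gameObject.SetActive(true);   // commented out so the canvas is not toggled on "found"
    }

    private void OnTrackingLost()
    {
        Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " lost");
        // gameObject.SetActive(false);  // commented out so the canvas is not hidden on "lost"
    }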
I would like the photo of the breadboard connections, please.
When I tried to run the demo I got a "Vuforia initialization failed" error, but when I opened the file in which the error was reported, Visual Studio did not report an error.
I am using Vuforia 6.2.10 in Unity 5.5 (this may be the problem, but there are no error reports to indicate that something is wrong). What could be the problem?
I would say this is versioning issue.
Well, now I have just a simple problem – the Kinect is not registering my face and is constantly running the default animations. Could this be a problem with the lighting on my face, or could it be a problem with the scripts (the Kinect not getting the required info)?
P.S. Thanks for the quick reply 😀
[…] OK, so back to what we do next with is “interface with our characters”. That’s right, I brought you to life, now speak to me! Love me! Give me attention, Pooh Bear, I MADE YOU! The user interface of augmented reality, especially for eyewear, is being developed now, and is ot yet ready, so, sorry but Pooh is still too dumb to play. The technicians and UI developers are working closer than ever as we enter an age of seamless integration between the game engines like Unity 3D or Unreal and SDKs like Vuforia. The SDKs help you get started from developer portal to final product. Check out tutorial here. Additionally, you can download Augmented Reality User Interfaces here. […]
Hi, thanks. Very good tutorial. This works on PC, but on Android it does not work. Pressing the buttons on Android does nothing.
Hi, I'm really interested in this demo. Can you make a tutorial for this demo?
Why can't I download the script? Thank you.
Hello, because this one is for sale.
Hi,
When I select the Business Card as my image target, it does not display in Unity as that image. I have also not been successful adding my own custom images.
What am I missing?
How do you run the provided code?
Does it work with Kinect v2?
Unfortunately, no.
Thanks for everything. I have a question: the app size is very big, so can we add a link to the video instead? Like saving the video on YouTube and viewing it with AR?
What types of cameras can be used for such projects?
Thank you.
Thank you.
Thank you.
How do I make the attack button function so it attacks a card?
hello
Can you clone this app?
I mean, can you create source code similar to this game?
I will be the first buyer :p
I think it's easy for you because you already made an 'Augmented Reality Coloring App',
but can you make a complete project for a coloring book with 6 pages, for example?
Hi, I need help; my application does not display the textures of the OBJ model.
Hi, I download the whole project to try this. But it has error in the script PolygonField.cs line 96: lineRenderer.positionCount = go_points.Length; It says “Type `UnityEngine.LineRenderer’ does not contain a definition for `positionCount’ and no extension method `positionCount’ of type `UnityEngine.LineRenderer’ could be found. Are you missing an assembly reference?” Could you please help me to solve this problem? I am using Unity 5.5
Hi Helene, did you solve it? I have the same problem; could you help me?
Thank you.
I have a question
How do I find the BundleID and the Wikitude license Key.txt?
Get them from the Wikitude website.
The snapshot button and exit button work, but the share button is not working. Why? What further checks are needed?
Can I use it on a laptop?
Can't it be 64-bit?
Thank you very much.
I have a favor to ask as a CAD engineer.
Can you apply CAD modeling?
Please check the relevant links.
https://www.youtube.com/watch?v=CX0eVngKMH0
Thanks.
Where can i buy the full version tutorial?
Hello, if you are still interested it is within this project: https://www.ourtechart.com/product/mimic-4/
When trying to set the image target to the pre-decided target, the image only comes up as white. I've tried various photos and digital drawings as well as your target. Do you know what the problem might be?
Hello, can you say whether it is possible to change the 3D model in the app? For instance, I press a "change model" button, choose an object, and that object appears in the app.
app content link is broken. thnx
Hello, from what I see it’s working
Download # Archived App Content (*.m4v file video , C# scaling code, etc.) (*.rar file) link broken
Actually, it’s working. Starts downloading at once.
Hi. Can you please make a tutorial for this one please. Its really awesome. How did you do it man ? You are so amazing.
Hello. You can unlock the secret by buying this one: https://www.ourtechart.com/product/mimic-6/
Why do I get an error while adding the same C# script?
I am not able to download any of the C# scripts; please help, thank you.
how can i get/buy this project?
Hello, you can buy it here: https://www.ourtechart.com/product/mimic-9/
Hi, I cannot download the source code. Could you repair the download source?
I checked it – everything works. Which link is broken?
Hey, congratulations on your tutorials; you help the community.
Do you have any sample with photo capture when the 3D model is displayed?
Thanks
(*.rar file) I get an error when downloading; nothing comes in. Please help.
Hello, it is automatically downloaded in the background. Check your browser/downloads folder.
My image is not showing on image target
Can we click a virtual button using the mouse pointer instead of touching it by hand? If yes, please suggest the way. Thank you.
Sure, just use a simple UI button instead of a virtual one.
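A minimal sketch of that standard-UI-button alternative; the class, field, and handler names are hypothetical, only the Unity UI Button API itself is real:

    using UnityEngine;
    using UnityEngine.UI;

    public class SimpleButtonHandler : MonoBehaviour
    {
        public Button myButton; // hypothetical: a normal Unity UI Button, clickable with the mouse

        void Start()
        {
            // Register the same logic you would otherwise run in a virtual button callback.
            myButton.onClick.AddListener(OnMyButtonClicked);
        }

        void OnMyButtonClicked()
        {
            Debug.Log("Button clicked with mouse or touch");
            // ...trigger your animation, model swap, etc. here
        }
    }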
Great tutorial! I just wanted to know whether there is any target limit in this case. I am planning to have about 50 targets in my project and is it possible? Thank you very much
In theory it should work ok.
Hey, I want to ask something: when I change the Wikitude license key to my own license, the camera still does not work on my webcam, but when I switch to Vuforia it can open the webcam camera. It seems to be an error with the Wikitude camera, maybe. Isn't that right?
tnx . so cooool
Does BuildAR support and display multiple markers simultaneously?
It does.
What are the marker sizes? Can they vary? Thanks in advance.
tnx coooooollllllllllllllll .
we stand for new tuts .
tnx
Is it possible to make a windows build from this ?
Hello, yes that’s possible. You can test it here:
https://drive.google.com/open?id=0BygvzTqnzm_wTnFLQm4xZmFaeUE
Can it work on animated images?
Hello, sure, green color is filtered.
Hi – Just wondering if this works in 5.6.3?
Your project file works in 5.6.3, however, when I build my project based on your parameters, I can’t seem to make it work at all:)
Is there an update?
Hello, what’s the error?
[…] We share the data. And you?Hit like button and share with everybody!More data on this Augmented Reality tutorial: https://www.ourtechart.com/augmented-… […]
How can I implement virtual buttons like an air tap rather than static buttons? The number of buttons changes at runtime, and when a user points their finger above a button I want to call that button's function. Can you help me? Thanks.
Hello, Matthew did a great work here: https://www.youtube.com/watch?v=Fgd21lbhikU
Why can I not load my database using vuforia-samples-advanced-unity-5-0-10?
Please help
Hello!
Does the Vespa User Interface also work on iOS?
Can I change the Vespa to another character?
Thanks,
Fábio
Hello,
Thank you for your interest. It works on the iOS platform. Currently the code is adapted to this specific Vespa model, so most likely you will have to make minor code modifications. Documentation is provided within the project after you buy it.
Regards,
Edgaras Art
Edgaras,
I'm not a developer. If I need some help, do you provide support?
Best Regards,
Fábio
Sir, what is "Model" in the augmented reality project?
The name of an object?
The name of a scene?
Or what?
Thanks
public GameObject Model
GameObject.FindWithTag("Model")
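In other words, "Model" is a Unity tag assigned to the 3D model GameObject. A minimal sketch of how such a reference is typically obtained (the class name is hypothetical, only the tag lookup reflects the two lines above):

    using UnityEngine;

    public class ModelReference : MonoBehaviour
    {
        public GameObject Model; // can be assigned in the Inspector...

        void Start()
        {
            // ...or found at runtime via the tag "Model" set on the 3D model GameObject.
            if (Model == null)
            {
                Model = GameObject.FindWithTag("Model");
            }
        }
    }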
I do provide support for all the projects you buy from me.
Can you teach me how to do the Android plugin? The share button doesn't work 🙁
Excellent! Can you show the tutorial you used to do this in Unity?
Sorry, this one is only for sale.
Okay, if allowed, can you tell me what kind of plugin can achieve such a function?
Hello, I just sent a comment but I can't see it here?
I am following your blog; it is fantastic.
I am making an Augmented Reality app and I need some help with my code, please:
how do I make a virtual button that, when pressed, plays a specific animation on the character?
I want to make 3 buttons, each with a different animation.
Thanks in advance.
Hi, you can follow this tutorial:
https://www.youtube.com/watch?v=Fgd21lbhikU
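Beyond the linked video, here is a hedged sketch of a Vuforia virtual button handler that plays a different animation per button. It assumes an older Vuforia version (where VirtualButtonAbstractBehaviour and IVirtualButtonEventHandler exist) and a hypothetical Animator whose trigger names match the virtual button names:

    using UnityEngine;
    using Vuforia;

    // Attach to the ImageTarget; create one child virtual button per animation.
    public class VirtualButtonAnimationHandler : MonoBehaviour, IVirtualButtonEventHandler
    {
        public Animator characterAnimator; // hypothetical: Animator on the character

        void Start()
        {
            // Register this handler with every virtual button under the image target.
            foreach (var vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            {
                vb.RegisterEventHandler(this);
            }
        }

        public void OnButtonPressed(VirtualButtonAbstractBehaviour vb)
        {
            // Use the button name as the Animator trigger, e.g. buttons named "Dance", "Jump", "Wave".
            characterAnimator.SetTrigger(vb.VirtualButtonName);
        }

        public void OnButtonReleased(VirtualButtonAbstractBehaviour vb) { }
    }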
When I first created the tag "Model" and ran Unity, everything worked! However, I then added another tag called 'Model1' and deleted my tag 'Model'. I then realized that the tag 'Model' was part of the code. I added the tag 'Model' back and assigned it back to my model. However, the buttons don't work anymore. I only have 1 tag in my tags, and that is 'Model'. Please advise? Thanks!
Note: I just realized that the object is scaling (as being shown in the panel under Transform) but not scaling on the screen.
Do you use Vuforia?
Yes
How or where can I find study resources on Vuforia and Unity?
You’re here 🙂
Do you recommend going tutorial by tutorial?
For you? It’s up to you.
Yes
That's so great, I'm learning so much.
Hi,
how can I download an old SDK for Unity?
Unity has all its old versions on their website: https://unity3d.com/get-unity/download/archive
How can I download an old Vuforia SDK for an old Unity version?
Do you have all versions of the Vuforia SDK on your website?
Yes, the Legacy: https://developer.vuforia.com/downloads/sdk
No, I don’t have all versions of Vuforia.
Hi, can you put up a tutorial on object tracking: how to track an object, and how, using the tracking, we can change the model color and other things? Please go through the
following link: https://www.youtube.com/watch?v=c_jFgEiFotE. Please help me, sir; if you put up a tutorial on this it will be very helpful for me.
Really great! Is it also possible to add kind of a zombie overlay to the face?
Yes, but a little bit different approach would be used to make it as a “mask”
Can you send me the Vuforia package (Vuforia 6.0.117), please? Thanks.
Please use 6.2; it should work the same way:
https://developer.vuforia.com/downloads/sdk
Hi there,
First of all, I want to thank you for publishing great tutorials. My question is: how can I establish the data transfer between the Arduino and an Android device instead of the computer, so that the Android device can interpret the data coming from the Arduino (via Bluetooth or WiFi) and manipulate the AR model? Maybe it would be too complicated to explain here, but do you know any tutorials or sources I can use? Thanks a lot.
Nice augmentation. I also saw your model-based tracking for car wheels, which is based on the VisionLib SDK. Do you think combining Vuforia VuMark tracking and Vuforia Model tracking in one Unity scene is possible?
There was a discussion in 2013 on the Vuforia Developer Portal regarding User Defined Targets and Image Targets in the same Unity scene, which was not possible to run. I tried but also cannot get it to work.
Hello.
It's really hard to tell; I haven't tried combining them. But do you want to use both at the same time, not switch between trackings separately?
Regards,
Edgaras Art
Yes, I have to use the User defined Target as a fallback in case of Image target is not detected.
I really don’t understand what you mean by that.
If, for example, a model or image target is not recognized (due to bad environment conditions like bright or low light), the user should still get the possibility to enable augmentation. Hence he should be able to start User Defined Targets to track the scene and start the augmentation.
That's not entirely how User Defined Targets work. When you talk about User Defined Targets you are probably thinking more of a SLAM technique (which is more like what ARKit and ARCore offer), but this is not correct. With UDT you can just take a snapshot in real time and augment the content without a predefined Vuforia image database.
In Vuforia you can enable or disable SLAM for every kind of target type, independent of the type. So there exist Model Targets with or without SLAM, UDT with or without SLAM, image targets with or without…
The question is whether it is possible to have more than one target type in one scene. E.g. if the Model Target fails to be detected, the user could press a button that grabs a snapshot and augments the content onto the current scene. For example, for your car wheel demo: if it fails, the user could align the wheel, press the UDT button, a snapshot of the car with the wheel is taken, and the content is shown attached to the snapshot. Of course it will not work for the next car, but at least for this one.
Extended tracking is not SLAM.
"The question is whether it is possible to have more than one target type in one scene. E.g. if the Model Target fails to be detected, the user could press a button that grabs a snapshot and augments the content onto the current scene." – This would depend on the environment itself; it would not work 100% in all cases.
The question is not whether it might detect the snapshot – that may depend on the environment – the question is whether it is possible with the Vuforia SDK. As I already mentioned, in former versions of the Vuforia SDK either nothing worked at all or the application just crashed. But you already gave your answer with the first reply, "I haven't tried combining". Maybe this would be a challenge for you to get it to work…
Alternatively, Model Targets could be used, but again I need UDT as a fallback.
In fact, they don't have to work at the same time, because one is the fallback. But they should be in the same scene!
It does not work; same positionCount error. How can it be fixed?
I am unable to scale or move my object through touch; what is the reason behind this?
You mean you can’t scale using Sliders?
Yes, I want to scale the object with my finger touch. Plus, I am also facing an issue regarding the position of my object: sometimes it floats in the air instead of appearing on a flat surface.
Hi 🙂
Can I do something like this using JavaScript and Vuforia Web Services?
Thanks!
I believe VWS works only for image targets.
P.s. Hi!
It's not working with vuforia-unity-6-2-10 and Unity 2017.3.
Help me.
Lots of things have changed since then. If you use 2017.3 with integrated Vuforia, then some Vuforia files are duplicated and you get errors. I suggest you recreate it from scratch based on the video, but using the newest Unity version; don't import Vuforia 6.2 anymore.
You'll have to add that function yourself. If the model is floating in the air, maybe your device doesn't have a gyroscope?
gyroscope sensor is present in my device
Hi, is there any tutorial on implementing a way to produce a sound, such as an AR music instrument, when a user puts a finger on a virtual button? Do you think it is possible to do this? Thank you so much.
Hello. Of course, that's possible, nothing fancy here. There are some tutorials with 3D models, so based on those you can adapt it to your needs.
will you please help me with that function?
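Since pinch-to-scale comes up repeatedly in these comments, here is a minimal sketch using plain Unity touch input (not tied to any Vuforia API); the class name and the sensitivity values are hypothetical. Attach it to the model you want to scale with two fingers:

    using UnityEngine;

    public class PinchScale : MonoBehaviour
    {
        public float speed = 0.005f;              // hypothetical sensitivity factor
        public float minScale = 0.1f, maxScale = 5f;

        void Update()
        {
            if (Input.touchCount != 2) return;

            Touch t0 = Input.GetTouch(0);
            Touch t1 = Input.GetTouch(1);

            // Distance between the fingers now and one frame ago.
            float prevDist = ((t0.position - t0.deltaPosition) - (t1.position - t1.deltaPosition)).magnitude;
            float currDist = (t0.position - t1.position).magnitude;

            // Grow or shrink the model proportionally to how the fingers moved apart/together.
            float factor = 1f + (currDist - prevDist) * speed;
            float newScale = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
            transform.localScale = Vector3.one * newScale;
        }
    }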
Hi great tutorial, but the links for the can downloads are all dead?
Hi. which ones are dead? It’s working for me.
Please, I need the full A–Z tutorial for this cloud recognition based AR.
thank you
Hi,
I would like to make an AR tourism booklet, something similar to the video.
Can I import my own 3D models? I have my own customised buildings and characters, so is it possible to import them?
If yes, which format should I save them in? .FBX? I use Autodesk Maya.
Do you sell a plugin or something to make this task easier?
How can I test the above application without having a Kinect? Please tell me the alternative to Kinect; I have a depth-sensing 2D/3D camera! Thanks.
Hey, please help me to create a video cloud service using a YouTube URL.
This Mimic 15 project will do the job for you!
Hi, Thanks for AR Tutorial. Could you please tell me how I would be able to make this example as an Android App?
Hello, take a look at Unity3D + the Vuforia plugin. Once you have these tools installed, you can have something running in about 20 minutes (from the Unity3D editor) that would work the same way on Android/iOS platforms once the solution is exported. Good luck!
Hi,
thank you very much for your tutorial!
I am a newbie concerning augmented reality, but may I ask you if it is possible to create a sphere instead of a can?
My idea is to show the earth (globe) with a geostationary satellite (augmented reality)..
So, if the globe is rotating, the satellite should rotate too.
But I am not sure how to start, because the satellite has to always stay over the same spot (for example Europe).
I think I have to make a connection between the 3D model (the satellite) and the texture of my globe (Europe), but I don't know how to do that…
I believe that the introduction of augmented reality in publishing and education is really a breakthrough. This stimulates and attracts the attention of students in the learning process. I think you will be interested to read about the role of augmented reality in publishing ( https://invisible.toys/augmented-reality-for-publishers/ )
Can you please share the Android Ultimate Plugin Controller? I've been searching every site and can't get it. -___- Thanks.
Excuse me, is that project made with Unity3D? I want to buy it, but first I need to know that, and whether it is compatible with Android and iOS.
Hello. Thank you for your interest in my project. Yes, it is made with Unity3D and both of these platforms (Android and iOS) are supported.
Is the source code usable with the 2017/2018 Unity versions?
Can I use it for commercial purposes?
Tks.
Just checked on Windows Unity3D 2018.1.0 – works fine. Sure, it can be used for commercial purposes, just keep in mind, that you need to buy license, if you want to eliminate XZIMG watermark.
Thank you so much for info!
Br.
Gianfranco
Hi, thanks for this work you show us. I have a question: I have created two scenes with models and the buttons to scale, rotate, and change scenes. In the first scene I have no problem, the buttons work properly, but when I change to the next scene the buttons don't do anything, and the rotate buttons work but the models just spin around; only the previous-scene button works. As I did in the first scene, I tagged my model as "Model", so I don't know what the problem is. Please help me with this. Thanks in advance.
Hey, this is not about this video, but I had to ask: how can I implement multiple virtual buttons in a scene? Basically, I want to see the C# code for that. Thanks.
Thanks a lot for the answer.
So if I pay, can I download the entire project to use as a base for another project?
Also, do you include a guide or something to understand how it was built?
After I pay, could you resolve doubts and give some support if I need it?
Thanks a lot, and sorry for so many questions, but I'm very interested and I need to be 100% sure.
Hello,
"So if I pay, can I download the entire project to use as a base for another project?"
Exactly; I made this project for that purpose.
"Also, do you include a guide or something to understand how it was built?"
Documentation is included within the project, explaining the scripts that were used, how to build AssetBundles, and where to change the Vuforia client and server keys so you can work with your own Vuforia Cloud Database (account).
"After I pay, could you resolve doubts and give some support if I need it?"
Of course, I'll answer any question you have.
Hi.
Excellent work!
Do you know EasyAR?
Do you think you can develop the same features with it?
Hello,
Thanks! I was testing EasyAR a while ago. I haven't used cloud recognition with it, but I don't see a reason why it couldn't be adapted accordingly.
Hi.
Could I ask you how much it would cost to make it for EasyAR, and how long it would take you?
Thanks
Do you have a web platform where it is administered?
It’s done through Vuforia portal.
Could you please update this? Unity 2018 and Wikitude 8 are very different now; even the scripts have changed.
Yeah it’s already obsolete.
Hello. I already created this, but why is it so slow on my device? I used a Samsung Galaxy A5 (2015).
Hi, thanks a lot for your answers.
It's confirmed: I will purchase the Mimic project during this week.
After I purchase the project, I hope you can help me if I need it.
Greetings.
Hi everyone, I'm trying to apply the course for image tracking, but the target image appears as a solid white color. What is the problem?
Hello! Does it work with the latest Unity version?
Not the latest, but I’ve remade it from Unity3D 2017.3.1 to Unity3D 2018.2.4 version (on this latest version selection from gallery and screenshot functionalities are disabled at the moment – other than that everything else works smoothly)
When I try to load the project, this information appears:
[VisionLib] FATAL: Windows: Could not load filename:C:/Users/eduar/Documents/motor visi/Assets/StreamingAssets\VisionLib\TutorialModelConfig.vl
help me
The app is very cool.
I'll buy this.
But I have a query: is it possible to hide the Vuforia logo from the camera view?
Hello, thanks! If you want to hide Vuforia logo, you need to buy a license.
Hi, I’m interested in what you have developed here, facial tracking. I would like to buy this project in unity to adapt it to what I really need. I do not see the price of that here, do you sell it?
Hello,
You can find it in ARShop:
1) https://www.ourtechart.com/product/mimic-11-ar-face-tracking/
or
2) https://www.ourtechart.com/product/mimic-14-augmented-reality-face-replacement-make-project/
Hi,
nice tutorials.
Just two things:
1st – how can I save multiple screenshots together?
2nd – how can I save images to the gallery?
P.S. I only want the screenshot option, not the share one, so please help!!
Thanks
Hello. I want to ask whether it is possible to create this application using the free version, for learning purposes? I tried to build it for Android but it doesn't work; an error comes up. I am looking forward to hearing from you. Thank you very much!
Hi. What you see here is the free version. It's hard to tell the root cause of what is not working properly on your side, but I can assure you that this project works great and you will receive support from me if you buy this AR project.
Can you create a short tutorial on how to plug the free version AF-1.5.2-Trial into Unity, for Android only? I really want to learn how to create it.
Sorry, but I don’t have such plans at the moment.
Alright, never mind. Thank you.
I didn't understand the need for making the above project; please explain it to me briefly.
A domino game tutorial?
Hello,
I work for the Brandenburgische Technische Universität Cottbus-Senftenberg, based in Cottbus and at the moment in COCKPIT 4.0, a research project for innovative automation solutions for small series assembly in the aviation industry.
Searching for a solution I found your tutorials and I would like to discuss further with you.
Kindly contact me.
Best Konstantinos.
Hi. I am not getting any error, but when I press play, the canvas image appears on screen; then when I scan the marker it disappears, and when I remove the marker the canvas image appears again. How can I solve this? Please advise.
And thanks for the awesome tutorials; I have been following your channel for the last year and a half. Thanks.
Thanks so much <3
Can you please upload the .cs files here?
Hello, this project is for sale, therefore, script is not publicly available.
Hi,
Thanks for the tutorial.
I am trying to do something similar using ground plane detection and only placing the dominos one at a time (not using the Add Multiple script). My domino base has disappeared and the objects fall from above rather than being able to place them where the base should appear.
Do you know why this might happen?
Thanks again.
Hello, does it fall from above onto your ground plane, or does it fall infinitely?
Hi! Thanks for replying.
Falls from above to the ground plane. I also changed the scale to 0.2 because they were huge otherwise – I think that might be the problem?
Hi,
Would this also work if the AR was built using Vuforia in Unity and then deployed as WebGL, with the WebGL build uploaded to a website?
Regards,
Ebrahim
Where are the files you add into Unity, such as the buttons etc.? I can't seem to download them.
Hi, I'm trying to download the content you put into Unity but can't seem to for some reason. Is there anything in particular I need to do to use the files?
The link to download the content is working, I’ve just checked.
Hi, thanks for checking; I've sorted it out. Great tutorial 🙂 Can I ask: if I wanted to create content from scratch, do you use Unity for that?
Hello,
I created the user interface according to your tutorial. All buttons except the scaling buttons work fine. With the scale-down button, if we keep pressing, the model gets tilted at one stage and zoomed. How can we resolve the issue so we can scale down to the minimum level without tilting?
Thanks
Gundus
Hello,
First of all, your videos are great and a good source of inspiration.
I would like to ask: do you have a code sheet that I could have, for example?
Thanks you 🙂
Sorry, this is available only as a demonstration.
Ok no problem 🙂
But still, did you use Vuforia in order to make this ?
Thx you for answering 🙂
Yes, for card tracking Vuforia was used.
Hello
Do you use "cloud image targets", or are the image targets already in the Unity project?
I ask because I would like to add target images to my application using only the Vuforia Web Services, without updating my application.
Thanks
Hello,
Cloud image targets are used. This project exactly fits your needs.
Regards,
Edgaras Art
Hey, nice project, but when I click on the link to download, it starts downloading another project, specifically the WikitudeMultipleTracking one. Please fix this so others can have access to this project.
Thank You!
Hello, thank you for pointing that out. The URL is already fixed. Good luck!
Hello Edgaras!
The URL for the APK is not fixed yet.
Hello. Are you sure? It’s working fine for me.
Oh, the wrong APK file as well; I've fixed it already.
I created the APK from the project, but the effects appear like popcorn all the time, not only when I open my hand. I opened it in 2018.2.18.
[…] More info on this Augmented Reality tutorial: https://www.ourtechart.com/augmented-reality/augmented-reality-tutorial-unity3d-vuforia/ […]
Hi, how do I add the XR setting in Player Settings?
thank you
Great tutorials. How easy is it to create the files to use in Unity, and what file types are they?
Good
How do I get the car controller and hide-car-body scripts?
[…] Tutorial details are available here: https://www.ourtechart.com/augmented-reality/ar-model-tracking-visionlib-sdk/ […]
Can you share me the Meta SDK download link for Mac OS X?
Sorry, but I haven’t saved it.
How do I buy it? I need the project.
[…] We share the knowledge. And you? Hit like button and share with everyone! More info on this Augmented Reality tutorial: http://www.ourtechart.com/augmented-reality/augmented-reality-user-defined-target/ […]
Hi, I am Safiq from India. I need this face tracking project for Unity; please provide it.
Hi. Sure, you can get it from here: https://www.ourtechart.com/product/paparmali-2-ar-face-tracking-using-kinect-2/
[…] This is Augmented Reality channel in which easy-to-follow video tutorials are provided. In some cases interaction devices and various Arduino-based sensors are used to make it possible to manipulate the virtual content in Augmented Reality. We share the knowledge. And you? Hit like button and share with everyone! More info on Augmented Reality technology: http://www.ourtechart.com/augmented-reality/demo-augmented-reality-technology/ […]
Hi, I am trying to run Vuforia with Kinect 2 in Unity. Can you help me with how to implement that?
Please do not follow this tutorial anymore; it is deprecated. New approaches should be used for that matter, one of which is the Kinect 2 SDK on the Unity Asset Store, to achieve something like this:
https://www.youtube.com/watch?v=ksKxN6aVg6s
and this:
https://www.youtube.com/watch?v=JARnCPKrpZU
or this:
https://www.youtube.com/watch?v=yGzgi_RQrCk
I think your app is really awesome. I wanted to know if you have any Lexus models.
Thank you. Unfortunately, I don’t have it.
I need to know more details about the software and its capability, and how it can be applied in the market nowadays as an AR solution.
As mentioned in the description, such an AR solution is a perfect fit for events where lots of people gather – it would attract a lot of attention to you and increase your brand awareness. It depends on what message you have; people could take snapshots and share them on social networks with your watermarked brand. Of course, another option is to use it as a virtual dressing room, which means clothing shops would offer the possibility to try on clothes before clients buy. Such an innovative approach to the client would boost a clothing company's sales.
Does this project include hybrid tracking using kalman filter?
Good question, you should ask Vuforia creators whether they use Kalman filter.
thankyou somuch !!
I want to create an AR card application that I update automatically each time I add a new client card.
How can I get there, and how many different business cards can I add each month?
Is it possible to do this with a cloud solution or Firebase?
Hello, Mamadi,
This Paparmali 3 project (app) would allow you to do that. Overall you can keep 100.000 (100k) image trackers in Vuforia Cloud Database. You don’t need Firebase for that, unless you have some other specific needs on your mind.
Regards,
Edgaras Art
Is Unity Remote supported by your app?
How can I buy it? Give me the link.
"Since Vuforia version 8.1.7, the Android platform introduces a bug that causes each model/texture to be red – fix in progress…"
Fix in progress?
Yeah, still in progress.
Then I look forward to it.
The issue is already solved on Android. You’re good to buy it!
I want to pay for this project but PayPal is not working.
Is there another way to buy it?
So far, current clients haven't had any issues processing payments via the PayPal method provided on this website. At the moment only the PayPal method is available. For more details, let's discuss over email.
Hello. I am finishing my thesis project, and for this I need to learn how this app was made. I want to know how much you would charge to teach me how to do this. I am not a programmer; I do 3D modeling, and I want to learn this from scratch. I am looking forward to your response.
I will respond to you through email.
Hello. I sent you a letter about the application, please reply.
I’ve already responded to your email. Let me know if you have any further questions.
Where can I buy the license to eliminate the watermark?
Hello.
xzimg.com
Hi, can you share it with me?
If I want to add a new target and content,
do I not need to use Unity3D to add and update content?
Do I need to host or buy a domain to upload my images?
You can upload image targets and assign URLs to content (a URL to a model (AssetBundle), a URL to an image, a URL to a video, etc.) from the iOS and Android apps and from Unity3D as well. But keep in mind that you would need to export and publish these apps on your own Google Play/App Store accounts. You also need a place (server) where you keep all your content, unless everything is already online (a YouTube video, an image from a website, etc.).
Image targets with metadata are uploaded to the Vuforia Cloud Database – documentation on how to set it up is provided within the project. Additionally, I upload the target image to my own server, as Vuforia doesn't provide direct access to the uploaded target image itself.
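As a rough illustration of the "URL to a model (AssetBundle)" part, here is a hedged sketch of loading an AssetBundle from such a URL at runtime; the URL, asset name, and class are placeholders, not the project's actual code:

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    public class AssetBundleLoader : MonoBehaviour
    {
        // In a real setup this URL would come from the cloud target metadata.
        public string bundleUrl = "https://example.com/bundles/mymodel"; // placeholder
        public string assetName = "MyModel";                             // placeholder

        public IEnumerator LoadAndSpawn(Transform parent)
        {
            using (UnityWebRequest req = UnityWebRequestAssetBundle.GetAssetBundle(bundleUrl))
            {
                yield return req.SendWebRequest();
                if (req.isNetworkError || req.isHttpError) yield break;

                // Extract the bundle, instantiate the model under the recognized target, then release the bundle.
                AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(req);
                GameObject prefab = bundle.LoadAsset<GameObject>(assetName);
                Instantiate(prefab, parent);
                bundle.Unload(false);
            }
        }
    }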
Your test application zip is not opening.
Do you mean you can’t download, or can’t start exe properly?
I’ve just tested, it opens up just fine, don’t forget to extract it before trying it out.
Hey, it's not working. When I downloaded it, it showed a RAR of 1.19 GB, and while unzipping it shows an error.
Hello. Thank you for your interest.
I have just re-tested everything just in case: re-downloaded, un-archived and started the solution successfully. Are you sure you have enough space on your HDD/SDD?
Can I have this game?
Please contact us via email.
Hi, I'm a photo booth manufacturer in my country, Brazil (South America). There is interest in becoming a distributor of yours in my country; I manufacture equipment that could run your software.
Regards, Uilton.
Hello, I sell only software, not the hardware. The application runs on a PC + Kinect 2, you can download and test it. Link is provided in the description. For more details please contact me through contact form.
I am a student. I want to create an AR book for my little brother, so can you please share the tutorial?
Please contact me using contact form.
Do you teach how to create this in Unity? Do you have any course package?
Hello. At the moment I don’t teach this.
Hi,
Can you do this with unity ?
https://www.youtube.com/watch?v=ZgiQ2-OAnJQ
Wow, nicely done, I love it! Yes I could do it using Unity3D.
Hi,
I want to do the same thing using the Intel RealSense D435 camera.
Is it possible?
Hello,
In theory, yes, I can see that it has body tracking feature.
Nice! Did you develop it in Unity and Vuforia? How did you animate the house?
Hello. Yes, Unity and Vuforia. Actually, on the ground there is a masking plane and the house parts are below it. On touch, the house parts go up (above the masking plane).
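A minimal sketch of that "parts rise above a masking plane on touch" idea; it is not the actual project code, and the part list, heights, and speeds are hypothetical:

    using UnityEngine;

    public class RevealHouseParts : MonoBehaviour
    {
        public Transform[] houseParts;    // hypothetical: parts that start hidden below the masking plane
        public float riseHeight = 0.5f;   // how far each part moves up when revealed
        public float riseSpeed = 1f;

        private Vector3[] targets;
        private bool revealed;

        void Start()
        {
            // Pre-compute each part's final position above the masking plane.
            targets = new Vector3[houseParts.Length];
            for (int i = 0; i < houseParts.Length; i++)
                targets[i] = houseParts[i].localPosition + Vector3.up * riseHeight;
        }

        void Update()
        {
            // Reveal on any touch or mouse click.
            if (Input.GetMouseButtonDown(0) || Input.touchCount > 0)
                revealed = true;

            if (!revealed) return;

            // Slide every part upward toward its target each frame.
            for (int i = 0; i < houseParts.Length; i++)
                houseParts[i].localPosition = Vector3.MoveTowards(
                    houseParts[i].localPosition, targets[i], riseSpeed * Time.deltaTime);
        }
    }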
[…] More information on Harry Potter alike AR tutorial: https://www.ourtechart.com/augmented-reality/ar-tutorial-harry-potter-game-object-tracking-based-on-… […]
If I purchase it, does it come with the source code to be manipulated in Unity?
Thanks.
Keep up the great work!
Hello, yes it does; the scene and documentation are provided within the project. Also, if you purchase it, I will provide an update of this project for a recent Unity3D 2019.1.x (or above) version within a week.
[…] You’ll find more info on this tutorial here: https://www.ourtechart.com/augmented-reality/tutorial/ar-slam-wikitude/ […]
I am a student. I want to create an AR book for my little brother, so can you please share the tutorial?
Sorry, but the project is only for sale.
I would like to thank this service for a great product!
We bought the source code and are very satisfied; it has all the necessary functionality, everything we need.
We had watched a bunch of tutorials on YouTube, struggled with the standard packages, combined them, and spent a lot of time.
Incidentally, we found the link to this service on YouTube.
And here everything is ready with all the necessary functionality, even more than we need; we have already adapted it all for ourselves.
Play Market is already reviewing our release.
Support is at a high level: if it is not clear how or what, you will be guided.
We bought the Paparmali 2 assembly, and while remaking it for ourselves, we were sent the Paparmali 3 update at no extra charge.
If you are gnawed by doubt over whether it is worth it, believe me, it is worth all the money spent on this source code, because it saves a lot of time.
Thank you Edgaras Art for the awesome assembly!
Thank you so much!
The RAR file is not opening; I am getting the error "file is not compressed".
Hello. Weird, I can’t replicate the issue. Please try out this zip archive: https://drive.google.com/open?id=1AI5ILi6y_XAemnGd_OWTmcz3YJpHnDcC Please let me know if now everything goes smoothly.
Hi,
your project was awesome!!!
But I need one bit of help: can you tell me how to render a 3D object on the body?
Thanks in advance.
Waiting for your response…
Hello! "Was"? Well, it still is! 🙂
As for your question: firstly, you need to track the body, or to be more specific, the body joints. By "tracking" we mean having 3-axis position and 3-axis orientation information for each body joint. Having this information, you can then "put" virtual content on top of the human body.
Regards,
Edgaras Art
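As a generic illustration of "putting virtual content on a joint" (not the project's actual code; the joint source is abstracted behind a hypothetical transform updated by whatever body-tracking SDK you use):

    using UnityEngine;

    // Makes the attached virtual content follow a tracked body joint.
    public class AttachToJoint : MonoBehaviour
    {
        public Transform trackedJoint;   // hypothetical: updated every frame by your body-tracking SDK
        public Vector3 positionOffset;   // e.g. lift a hat slightly above the head joint
        public Quaternion rotationOffset = Quaternion.identity;

        void LateUpdate()
        {
            if (trackedJoint == null) return;

            // Copy the joint's 3-axis position and orientation, plus a fixed offset.
            transform.position = trackedJoint.position + trackedJoint.rotation * positionOffset;
            transform.rotation = trackedJoint.rotation * rotationOffset;
        }
    }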
Hello, do you sell the project for Unity? The thing is, I want to add my own costumes.
Yes, this is a Unity3D project for sale. You can add any costumes you want. The main thing is that the costumes (outfits) must be rigged.
The thing is, I want to add individual items like one shirt, one dress, or one pair of pants; is that possible? And when you deliver the project, do you explain what must be modified in the mesh to make it work? Does the outfit adjust depending on the person's size? Does it also work from a profile view, or only from the front?
Yes, it is possible to add a shirt, dress, or pants, but it is important that they are rigged properly. I do not provide explanations of how to rig outfits; that's something that 3D designers/modelers are familiar with. And yes, the shirt/dress/pants or any other outfit is adjusted to the person's size. I'm not sure what you have in mind with the last part of the question.
Whats the price of this project?
Please contact me over email.
Hi
Great job with this application.
One question though: if I want to purchase the fitting application only (without the models and effects), would that be a different price?
Thnx
Hello,
Thank you.
As for your question: we do customize project(s), but only in one direction – forward, by expanding the project and customizing new outfits. What you ask for is the backward direction, and we don't sell empty projects.
Regards,
Edgaras Art
That's great, I want to buy one for learning. Keep it up!
Hello, of course! P.S. I'm working on a new face tracking project using ARCore. That project will include 3 options: 3D masks, 2D face filters, and camera effects. It will be much more advanced; it will be able to use all 3 at the same time – put on a 3D hat, add, for instance, animated blinking eyes on your face, plus add an "old movie" camera effect.
Great project! Support at a high level!
Thank you!
Hi Edgaras!
How much for this game?
Best regards,
Fabio
Hello! Sorry, but this AR game is not mine.
Hey,Greaat job!
But when I start the game it says "speech recognition is not supported" and everything on the screen is white…
Hello. Thank you! Are you sure your Kinect 2 is connected properly to your PC?
Would you mind sharing a tutorial on how to color the model while it is moving?
Thanks
You mean while the model's animation is playing? From the perspective of coloring, there would not be any difference between a static model and an animated one.
Is there any tutorial for it? If yes, is it possible to have it?
Hello. Unfortunately, there is no tutorial.
Thank you for the reply. If I buy the software, can I also have the project so I can customize it?
Yes you can! Documentation on how to do it is provided within the project.
It seems Kinect 2 is out of production. Can I buy an Azure Kinect DK instead?
Really? Well, you can find plenty of Kinect 2s on Amazon. Of course, I'm planning to add support for the Azure Kinect DK in the near future. Please keep in mind that the Azure Kinect DK costs much more than a Kinect 2.
Unfortunately, in Italy there are none, and because we're planning to buy tons of them, we need something stable and reusable.
Did you have stability issues with Kinect 2? Is the Azure Kinect DK already for sale in Italy?
Hello, is it possible to branch your project to adapt the source code to other RGBD cameras with different specs (eg: higher fps, or pixels) other than the kinect V2?
Hello. Unfortunately, no. It is possible only using Kinect 2 and once I’ll take my hands on Azure Kinect DK we’ll add support to it as well. At the moment it is available only in USA and they don’t ship to other countries.
Hi, you have done great work. I have been working with ARCore for a couple of weeks and I have some doubts about it. I want to know how to create a package using the ARCore Unity SDK for developing face tracking. It would be a great help if you could share some insights on it. Thank you in anticipation.
Hello,
Could you be more specific with “create a package”? What do you have in mind? *.unitypackage?
Regards,
Edgaras Art
Hi, I actually saw a video on youtube where we can create SNAP FILTERS in UNITY and deploy it into android mobile. They used “arcore-unity-sdk-1.13” version from google sdk. Where they have imported the google arcore sdk and used some default examples to get those filters. I want to use some other example features like identifying a human face and putting glasses. So, my question is how can I make an example package in google arcore.
Here's the link I referred to:
https://www.youtube.com/watch?v=gbKcdNEvuGc
Hi, just start with the sample scene provided by ARCore.
Hi,
How is it going?
How do you create a new package in the Google ARCore SDK?
*Did you get any idea on how to create a new package in the Google ARCore SDK?
Hello. The thing that confuses me in your question is that you mention the ARCore SDK. The ARCore SDK has no relation to package creation in general. If you want to create packages, take a look here: https://docs.unity3d.com/Manual/AssetPackages.html
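If by "package" you mean a *.unitypackage, it can also be exported from an editor script. A minimal sketch, assuming the script lives in an Editor folder; the folder path, menu entry, and file name below are placeholders:

using UnityEditor;
using UnityEngine;

public static class PackageExporter
{
    [MenuItem("Tools/Export Face Filter Package")]
    static void Export()
    {
        // Exports the given folder (with its dependencies) into a .unitypackage
        // written next to the project folder.
        AssetDatabase.ExportPackage(
            "Assets/MyFaceFilter",
            "MyFaceFilter.unitypackage",
            ExportPackageOptions.Recurse | ExportPackageOptions.IncludeDependencies);

        Debug.Log("Package exported.");
    }
}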
Hi,
I want to build an ARCore face tracking app as you did, but with different types of spectacle filters. Can you please guide me on how to achieve that?
Hi, is it possible to run this app on an Azure Kinect?
Thank you
Michele
Hello,
Not yet, but we have plans for it.
Regards,
Edgaras Art
Hello. Azure Kinect DK support was already added to Paparmali 4 – AR Superhero Outfit project!
Thanks for sharing. I have an AR project built with the Kudan SDK.
How can I build it in Android Studio?
Hello. You can’t, this is a Unity3D project.
Hi ,
I have tried to follow this tutorial. Everything worked fine except the C# script. I am getting the error "The type or namespace name 'VirtualButtonAbstractBehaviour' could not be found (are you missing a using directive or an assembly reference?)" in my script.
Do I need to add any DLLs exclusively?
Hi
congratulations!!!
I would like to have the source code of AR Vuforia Cloud Recognition; how can I get it?
Sorry, I am a French speaker and live in Africa.
Hello. You can get it from here: https://www.ourtechart.com/product/paparmali-3-ar-vuforia-cloud-recognition/
Very interesting demo. I really appreciate you publishing it publicly. Can we use a WEMOS D1 Mini board or a NodeMCU board to replace the Particle Photon board? If I use a Wemos or NodeMCU, can you please give me a hint or guide to complete the same project you published here? I just want to monitor parameters like temperature and humidity in real time using an AR-based app and a simple development board like the Wemos. Actually, I can't afford a HoloLens or a Particle Photon board. If you can help me, it would be so great. Thanks a lot again.
Hello. Sorry, but I won't be able to provide you with any suggestions for using other micro-controllers.
Hi. I'm doing my final year project about an AR Colouring Book this year. May I know how to make this application? I hope you can reply to me soon.
Hello, you can get this project from here: https://www.ourtechart.com/product/mimic-3/
Hey, can you tell me how to prepare a texture for a 3D model so that it wraps around it perfectly? I am working on a similar project, but I need the steps to create a texture that wraps around a 3D model the way I want.
Hello. You can follow this tutorial on applying texture to a model: https://www.youtube.com/watch?v=32lQxcIjxMU
Hello, I am so impressed by your amazing work. When I was watching this video I was actually wondering what the back of the body looks like, because at the moment it is showcasing all of the front tracking. Thank you!
Hello. Thank you. You can download the demo and test on yourself how it would look.
Did you add support for Azure Kinect DK?
Hello, not yet. Microsoft doesn’t ship it to my country yet.
Hi, did you manage to get a build on UWP or Windows devices in general? Or did you only try it in play mode? If you managed to get a build, please let me know how!
Hello. No, I haven’t exported for UWP.
Is it on a monitor screen or in the real world?
Hello. In the real world.
Hi, may I ask, what if I would like to change the background as well? Is that possible in this project? Thanks!
Hello. Yes, that’s possible!
Hey, I'm doing this kind of project for my final year thesis, and I need to do it from scratch. Can you help me, please? I have asked many people, but the cost is too much for me to handle. Anyway, thank you so much.
Hello, Francis,
You can acquire this project from here: https://www.ourtechart.com/product/mimic-3/
Let me know if you have any questions.
Have a great day!
Hi, could you develop a similar app for iPhone? Would you like to collaborate?
Hello. Yes, I can. We can discuss your specific app requirements over email.
Where is the tutorial for making this game?
Hello. No tutorial exists, but the project can be purchased. If you are interested, please contact me using the contact form.
How do you get the eyewear for this reality?
That’s using an app on your mobile device.
Hi, is it possible to use this software with the camera in a vertical orientation, like 1080 x 1920 resolution?
Hello. Yes, it’s possible!
hi, I love your project. <3
I'm a beginner and want to learn more about augmented reality. I want to build an application in Flutter, make something like this in Unity3D, and then integrate the app with Flutter. Is this possible? I would love to hear from you 🙂
Hello, thank you!
For the sake of your nerves and time, skip Flutter and build everything in Unity3D. In my opinion, there is no need to involve Flutter in whatever you are building, or you will get nowhere and your development time will skyrocket. Either way, everything is possible.
Have a great day!
How do I download after payment?
I just paid for Paparmali 5 – SmARt Mirror (Virtual Fitting Room) – Kinect 2 / Azure Kinect DK Body Tracking €995.00 EUR
30/3/2021
Thank you for your interest! Access to the project was provided a couple of minutes ago.
Hi Edgaras,
Does the fitting adjust according to the size of the person? What if it is a child, a smaller person, or the opposite?
Thanks!
Hello, Justin,
It would adjust automatically. You can check it out – executables are provided for testing.
Have a great day!
I need this project and I want to learn how to develop AR apps.
Hi Edgaras,
Great work.
I am an engineering student and am also working on AR in Unity, but I am facing problems making the UI and interacting with it, which you have done in an impressive way. Can you please guide me on how you made the GUI for that app, or on the specific approach you used? That would be a huge favour.
Thank you.
Hi Edgaras,
I am looking for a help for the following scenario :
My scenario is that I need to capture a color image with the Azure Kinect DK, and that image has to be saved in PNG or JPG format on my PC using Unity.
Can you please share a few references for implementing this scenario?
Thank you.
Hello,
It should be something like this: imageTex = kinectManager.GetColorImageTex(0);
All the rest is saving the texture to some path.
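A rough sketch of that saving part, assuming GetColorImageTex(0) returns a readable Texture2D as in the snippet above; the KinectManager reference and the save path are placeholders:

using System.IO;
using UnityEngine;

public class ColorFrameSaver : MonoBehaviour
{
    public KinectManager kinectManager; // assign in the Inspector

    public void SaveFrame()
    {
        // Grab the current color frame from the sensor.
        Texture2D imageTex = kinectManager.GetColorImageTex(0) as Texture2D;
        if (imageTex == null)
        {
            Debug.LogWarning("Color frame is not available as a readable Texture2D.");
            return;
        }

        // Encode to PNG and write it to disk.
        byte[] png = imageTex.EncodeToPNG();
        string path = Path.Combine(Application.persistentDataPath, "color_frame.png");
        File.WriteAllBytes(path, png);
        Debug.Log("Saved color frame to " + path);
    }
}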
Hello, I need a similar project but with different requirements. Can you help me with the development?
Certainly.