I tried taking a photosphere the other day and the output image from the drone was good, albeit it is a JPEG, which means I cannot do any meaningful edits to it. Trying to stitch the images together myself, 10 images were unusable to both Adobe Lightroom and PTGui, as they were of a clear sky without any reference points. In PTGui you can define points between two images, although due to the angles of the shots and the fact that the sky was clear, there is no way for me to say what point of one image is related to another. I mean, the DJI Fly app clearly knows which image belongs where and what position it should be stitched to. Is there maybe another approach I can use in these situations? It is kind of mind-boggling that I cannot seem to find any software that can figure this out, or that the Fly app cannot produce a DNG version instead, so we don't have to go through such endless hoops just to get a final image that is actually usable.

Is this a part of the metadata somehow? I don't know, but a JPEG for me goes right into the trash (it also bothers me that I cannot turn off JPEG images and save ONLY RAW files). When I have a clear sky without reference points, I use a generic pano sky and stitch it in with Photoshop. See some of my 360 spheres; you will notice the sky is the same on several.

I captured a new one yesterday with skies and detail in all shots, and no matter which option and combination of options I choose in Lightroom or Photoshop, the results are nowhere near the output of the drone JPG.

Cool, mate! How do you go about it, though? PTGui? Lightroom? Photoshop? They are morphed in a completely different way, and with PTGui I still have to select so many manual reference points between images. I would be really interested to know your workflow for this specific drone (the other models like the Mini 2 or Mini 1 do not have the same angle, so it is hard to say that you can just apply the same workflow). I was a bit shocked that nobody has come out with a good tutorial for this that produces a raw composite, after about a year of it being on the market.

I'd be interested to hear if you get a solution, especially one without using software that is not common to have (unlike Lightroom and Photoshop). I would say, though, that these photosphere things are... If we cannot get DJI to produce a RAW output from the get-go, then at least the aim should be to find a consistent way to stitch that others can apply too.

For sure! Seen one etc., but occasionally it could work, perhaps. If not, we are just stuck with an oversold feature that in the end has little value (sure, OK quality for Google Maps etc., but a few years down the line the image will be so small that it is not possible to view it, as resolutions and pixel density go up with new monitors).
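On the metadata question: DJI JPEGs do generally carry per-shot attitude data in an embedded XMP block, which is presumably what the Fly app uses to place each frame. Below is a minimal, stdlib-only sketch of pulling those angles out of a file. The `drone-dji:Gimbal...Degree` tag names and the sample XMP blob are assumptions based on DJI's `drone-dji` XMP namespace; inspect your own files with an EXIF viewer to confirm what your firmware actually writes.

```python
import re

# Hypothetical sample XMP block, shaped like the `drone-dji` namespace
# DJI embeds in its JPEGs. Tag names are an assumption; verify against
# your own files (e.g. with an EXIF viewer) before relying on them.
SAMPLE_XMP = b"""<x:xmpmeta xmlns:x="adobe:ns:meta/">
 <rdf:Description xmlns:drone-dji="http://www.dji.com/drone-dji/1.0/"
   drone-dji:GimbalYawDegree="+120.50"
   drone-dji:GimbalPitchDegree="-35.00"
   drone-dji:GimbalRollDegree="+0.00"/>
</x:xmpmeta>"""

def gimbal_angles(jpeg_bytes: bytes) -> dict:
    """Scrape drone-dji gimbal angles out of a JPEG's embedded XMP."""
    angles = {}
    for axis in ("Yaw", "Pitch", "Roll"):
        m = re.search(
            rb'drone-dji:Gimbal%sDegree="([-+0-9.]+)"' % axis.encode(),
            jpeg_bytes,
        )
        if m:
            angles[axis.lower()] = float(m.group(1))
    return angles

if __name__ == "__main__":
    # In practice: jpeg_bytes = open("pano_0001.jpg", "rb").read()
    print(gimbal_angles(SAMPLE_XMP))  # {'yaw': 120.5, 'pitch': -35.0, 'roll': 0.0}
```

If the angles are there, they could at least be used to pre-position each frame in a stitcher instead of hand-picking reference points against a blank sky.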
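And once you have a per-shot yaw and pitch, the "position it should be stitched to" is essentially just the spherical-to-equirectangular mapping. This is a hedged sketch of that geometry, not DJI's actual stitching algorithm; the canvas size and angle conventions are assumptions.

```python
def equirect_center(yaw_deg: float, pitch_deg: float,
                    width: int = 8192, height: int = 4096) -> tuple:
    """Map a camera direction (yaw/pitch in degrees) to the pixel where
    that frame's centre lands on an equirectangular panorama canvas.
    Convention assumed: yaw 0-360 runs left to right, pitch +90 (straight
    up) is the top row and -90 (straight down) the bottom row."""
    x = (yaw_deg % 360.0) / 360.0 * width
    y = (90.0 - pitch_deg) / 180.0 * height
    return (round(x), round(y))

# A level shot facing yaw 180 lands dead centre of the canvas:
print(equirect_center(180.0, 0.0))   # (4096, 2048)
# A frame pitched straight down (the nadir shot) hits the bottom row:
print(equirect_center(0.0, -90.0))   # (0, 4096)
```

Seeding a stitcher with these centres will not replace fine alignment, but it removes the "which image goes where" guesswork that makes featureless sky frames unusable.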