• Wed, 03 Mar 2021 22:43:08 +0000

    Instagram Takes on Clubhouse, Launches ‘Live Rooms’

    Few companies are faster to react to new competition than Facebook, and that continues to be the case with the announcement of “Live Rooms” for Instagram, a new feature that is a direct answer to Clubhouse.

    The new feature will allow users to “go live” on Instagram with up to three other people. Up to this point, Instagram has only supported live streaming with one other person on the stream.

    “We hope that doubling up on Live will open up more creative opportunities,” Facebook writes on a blog announcing the additional feature. “Start a talk show, host a jam session or co-create with other artists, host more engaging Q&As or tutorials with your following, or just hang out with more of your friends.”

    Live Rooms also build in a monetization option that will allow businesses to earn money. Last year, Instagram announced that it would allow viewers to purchase badges for their favorite content creators on the platform and that feature is also being included with Live Rooms.

    “With Live Rooms, viewers can buy badges for the hosts and use other interactive features like Shopping and Live Fundraisers,” the company writes. “We’re also exploring more interactive tools such as offering moderator controls and audio features that will be available in the coming months.”

    To start a Live Room, swipe left and pick the Live camera option, then add a title and tap the Rooms icon to add guests. You can see people who have requested to go live, and you can also search for guests to add. If you are the host of a Live Room, you’ll remain at the top of the screen when you add guests, and as the broadcaster, you can add guests all at once or one by one.

    Clubhouse started to become popular late last year and has seen considerable success due to its ability to bring people together during the COVID-19 pandemic. Observers have argued that the app could change social media, and its popularity continues to grow.

    Live Rooms is just the latest feature added to the Instagram platform specifically designed to tackle new competition. Last August, the company added Reels to compete directly with TikTok just a month after it tested a feature to compete with GoFundMe. Perhaps its greatest success was the addition of Instagram Stories which was a feature it added to compete with Snapchat. In 2018, its offering was twice as popular as the service it was designed to compete against.

    (via Tech Crunch)

  • Wed, 03 Mar 2021 22:04:13 +0000

    Street Candy Launches MTN100 35mm B&W ‘Motion Picture’ Film

    Street Candy has announced the new 35mm MTN100 film, which is produced by the same manufacturer that was a major player in black and white motion picture film production. Street Candy says it added this film to its catalog due to its “gorgeous cinematic look.”

    MTN100 is a panchromatic film for both outdoor and indoor photography and can be processed as a normal black and white film negative or, because it is a motion picture film, with a reversal kit to produce direct positives. Street Candy states that photographers can expect beautiful contrast, fine grain, and rich details in the 36-exposure roll of ISO 100 film.

    Coming from a German manufacturer with a century-long tradition in motion picture film, MTN100 will bring a unique character and elegance to your still black and white images.

    Below are a few examples of images captured on the MTN100 film, provided by Vincent Moschetti, Street Candy Film’s founder:

    Street Candy originally launched with its first film, called ATM400. Moschetti tells PetaPixel that it was a security surveillance film used in ATMs and other surveillance cameras in the past and has since been repurposed for 35mm film cameras.

    “Our goal is to support the film manufacturers still running production lines and keep a maximum of emulsions alive,” Moschetti says. “We also care deeply for the environment and have moved away from single-use plastic canisters, and since 2020 we pack all our film in recycled cardboard boxes.”

    Street Candy’s MTN100 film can be pre-ordered directly from the company’s website in packs as small as two rolls for 20.98 Euro (~$25.30) or as large as a 10-roll bundle for 104.90 Euro (~$126.55). The company says that it has less than 250 rolls remaining from its first batch due to “incredible support” from a pre-launch campaign, so supplies are limited for this run.

    Eventually, the MTN100 film will be available at Nations Photo in France and Analogue Wonderland in the United Kingdom and should ship by the end of March. The company says that if it does sell out of this run (which is fully expected), it is already working on producing the next batch.

  • Wed, 03 Mar 2021 21:24:49 +0000

    New Phantom TMX 7510 Camera Can Record An Insane 1,750,000 FPS

    Vision Research, the company behind the Phantom series of high-speed cameras, has announced the Phantom TMX 7510 and 6410 which are its first to utilize backside-illuminated sensors. The TMX 7510’s 75-gigapixel throughput can provide an incredible 1,750,000 frames per second.

    The TMX 7510 achieves 76,000 frames per second at full 1-megapixel resolution of 1280 x 800, over 300,000 frames per second at 1280 x 192 resolution, and over 770,000 frames per second at smaller resolutions — but it gets even faster.

    “With the export-controlled FAST option, TMX 7510 reaches a blistering 1.75 Million frames per second and 95 nanosecond minimum exposure time, eliminating motion blur,” the company says. “This enables precise motion analysis for extremely fast applications.”

    To work at these high frame rates, the cameras are extremely light-sensitive, with a native ISO of 40,000 for monochrome sensors and 12,500 for color sensors. The backside-illuminated (BSI) nature of the sensor allows for greater sensitivity by placing the photodiode closer to the light source, which allows for a smaller pixel size of 18.5 microns that Vision Research says provides more imaging definition.

    The TMX 7510 can reach that 1.75 million frames per second mark at a resolution of 1280 x 32. Below is a full chart of performance.

    As seen above, the camera can capture in “standard” or “binned” modes. The TMX Series offers 2 x 2 binning for high frame rates to provide users field-of-view flexibility. For example, the TMX 7510 can achieve over 617,000 frames per second at a resolution of 1280 x 94, or a resolution of 640 x 192 in binned mode.
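    The quoted 75-gigapixel throughput can be sanity-checked against the resolution and frame-rate pairs above. A quick back-of-the-envelope calculation, using only figures from this article:

```python
# Sanity check of the Phantom TMX 7510's quoted ~75-gigapixel/sec throughput,
# using the resolution/frame-rate pairs stated in the article.

def throughput_gpx(width, height, fps):
    """Pixels read out per second, in gigapixels."""
    return width * height * fps / 1e9

# Full 1 MP resolution at 76,000 fps
full_res = throughput_gpx(1280, 800, 76_000)
print(round(full_res, 2))   # 77.82

# Minimum 1280 x 32 resolution in the FAST mode at 1.75 million fps
fast_mode = throughput_gpx(1280, 32, 1_750_000)
print(round(fast_mode, 2))  # 71.68
```

    Both operating points land within a few percent of the advertised 75-gigapixel figure, which suggests the sensor readout, not any one resolution, is the limiting factor.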

    Below are a few examples of what is visible at these incredibly fast frame rates:

    The TMX 6410 has the same BSI sensor found on the 7510 and has similar features, but at a lower maximum framerate. While the TMX 7510 can capture 76,000 frames per second at its 1 MP resolution, the 6410 can do the same resolution at 66,000 frames per second. Additionally, the TMX 6410 has a maximum framerate of 1,500,000 frames per second compared to the 7510’s 1,750,000.

    The standard lens mount for both of the TMX-series cameras is Nikon F-Mount and includes aperture support for Nikon G lenses, but Canon EF, C-mount, and M42-mount options are also available. No pricing or date of availability for either camera was mentioned at the time of publication.

    (via DPReview)

  • Wed, 03 Mar 2021 20:51:24 +0000

    ‘Tokkingheads’ Lets You Animate Portraits Using Artificial Intelligence

    Rosebud.AI, a company that wants to “disrupt media creation,” has created a service called TokkingHeads that will take any still image and turn it into a moving, talking avatar using artificial intelligence.

    Tokkingheads advertises its service as being able to instantly animate a portrait photo with text, speech, or existing puppet animation from its library. The results are mixed, with some that could pass as close to the genuine article while others struggle with head shape and proportion.

    The service isn’t limited to photos of actual people and can animate still images of drawings or renders.

    The service is free, but a subscription account allows you to generate high-resolution videos; videos generated with a free account are watermarked.

    Tokkingheads, and Rosebud.AI, may appear to be less sophisticated than other AI-generated people such as Hour One, but it is also starting with a lot less when it comes to input data. Hour One is able to make its AI talking heads a lot more realistic both in how they look and how they talk because it uses a nearly full-scan of a subject’s head as well as a set of words spoken by a person in order to generate a realistic-sounding voice. Rosebud.AI only has a still image and has to generate motion and sound from scratch or port in existing audio from an unrelated source.

    The result can give a significant “uncanny valley” effect, but considering how little the company has to start with, it is still impressive. The idea of “Deep Nostalgia,” which brings old photos to life, is also possible, and Tokkingheads can be used to create similar results.

    Tokkingheads is just one iteration of what Rosebud.AI has created. Its other technology that it calls Generative Photos can instantly change the face structure, hair color, age, facial expression, and more using AI. The company positions this technology as a way to adapt any photo to work perfectly with a brand or e-commerce store and adjust stock photos to better differentiate from competitors.

    While many can still spot fakes — even in the very good examples of AI face manipulation — artificial intelligence is getting extremely advanced and heavily commoditized. One thing is for sure: it’s only going to get better and easier to use going forward.

    You can see more of the Rosebud.AI Tokkingheads artificial people on the Generative.Photos Instagram that’s run by the company.

    (via Laughing Squid)

  • Wed, 03 Mar 2021 18:25:47 +0000

    How to Photograph the Milky Way Core Season

    Milky Way core season is right around the corner. I wrote this quickie guide on how to photograph our galaxy for anyone interested in going out to do some astrophotography this season!

    So first, a little bit about myself. I mean, who would take advice from some random stranger on the Internets? I am Dan Stein, I have been taking pics of the stars for over 8 years now, and I love talking about astrophotography and helping others with their own star shots. I took my first nightscape back in college, and now I travel and take pics away from light pollution in my free time.


    The first thing you will need to do is find a dark site location. Use this map to find a place near you that offers pretty dark skies but doesn’t require driving for hours to get to. I live in the suburbs of NJ, and while it is difficult to find dark sites, there are a few within only an hour or so drive each way.

    The Light Pollution Map at DarkSkyFinder.com

    Next up is timing. You will need to take into account the current Moon phase which will determine if there will be a large amount of light reflecting off the Moon when you shoot. The fuller the Moon, the more it will wash out the stars. By the next new Moon on March 13, the Core will be out in all of its glory.

    Another factor of timing is rise and set times of the Milky Way itself. I took this shot in June, for instance:

    In the early season like March, the Milky Way will be rising during the early hours of the night, around 3-4 AM. The core will start to rise earlier and earlier as the season progresses, with mid-July being one of the best times to shoot it: by then it is visible right after sunset and can be seen throughout the night.

    Towards September, the core will set earlier and earlier and be visible for less than an hour before dipping below the horizon. No need to shoot just the Milky Way core though; the winter Milky Way is beautiful in its own regard, but with core season around the corner, I figured I would keep this guide focused on the Milk.

    Here are some core timing examples:

    Shot in March
    Shot in June
    Shot in September

    The last thing to consider before heading out into the field is weather. A clear sky is vital for seeing the stars; even the thinnest clouds can wash them out. That said, clouds are still cool to work with — they add another atmospheric element near the horizon. You just want to make sure the sky is mostly clear before heading out.


    Initial Settings

    Onto the camera goods. Plop your camera on a tripod and make sure it is secure. You will need to place your camera in Manual (M) mode. Set it to shoot in RAW, and use the daylight white balance setting. Manual will allow for full control of the camera, RAW will allow for better post processing (as you all probs know), and a daylight white balance is used as that is the color temperature that most represents a “true” night sky so it is a good starting point. RAW mode will allow that to be adjusted effortlessly in post production if the warmer tones aren’t your style.

    Next, put your camera in the live view mode (if using a DSLR). This will black out the optical viewfinder but make it easier to use the screen to compose your shots. Live view is also crucial for focusing, which is the next step. If you have a mirrorless camera, your camera is “always” in live view.

    Crank up the ISO to about 1600-6400 depending on how dark the sky is, and open the aperture wayyyy up to the maximum possible for your lens. We are trying to let in as much light as possible since everything around you is so dim. I won’t get too deep into the technicals of what is the best ISO to use for your camera and ISO invariance. I am trying to keep it simple here.


    Put your lens in manual focus mode, and then try to vaguely focus by moving the focus ring to the infinity marking on your lens. If it does not have that marking, try moving the focus very slowly until stars start to become visible in live view.

    Next, find a bright star and use your tripod head to place it in the center of the live view screen. Press the little magnifying glass button on your camera, which will enable a digital zoom feature on your screen. Press the button until the star is as big as possible on your screen. At this stage, gently rack the focus back and forth until that star is as sharp and pinpoint as possible. This can be tricky, so be patient, and if you can’t find a star to work with, try a bright distant object.

    Shutter Speed

    After focusing, you will need to calculate your shutter speed. We know the Earth isn’t flat, so it rotates, and thus too long of a shutter speed will blur the stars. But too short of a shutter speed and the shot will come out too dim. To calculate this, we use the NPF Rule.

    You may have heard of the 500 rule, but as digital sensors progressed, that rule became less relevant. The NPF rule is super precise and takes into account the many parameters of your specific setup. This is where having a wider lens will help; something between 14mm-24mm is nice for a widefield view. Remember to take into account your crop factor if you are not shooting with a full frame camera. Something like a 70-200mm zoom is great for events or sports, but it will make your shutter speeds super short because it does not take much time for a star to move across the frame since everything is “bigger.”
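    For anyone who wants to see the math, here is a sketch of the commonly cited simplified form of the NPF rule (the full version also accounts for declination; pixel pitch is your sensor width in mm divided by horizontal pixel count, times 1000):

```python
# Simplified NPF rule: shutter (s) = (35 * N + 30 * p) / f
#   N = f-number, p = pixel pitch in microns, f = effective focal length in mm
# This is the simplified published formula; exact calculators add a term
# for the declination of the stars in your frame.

def npf_shutter(aperture, pixel_pitch_um, focal_length_mm, crop_factor=1.0):
    """Longest shutter speed (seconds) before stars visibly trail."""
    effective_focal = focal_length_mm * crop_factor
    return (35 * aperture + 30 * pixel_pitch_um) / effective_focal

# Example: 14mm f/2.8 lens on a full-frame body with ~5.9 micron pixels
print(round(npf_shutter(2.8, 5.9, 14), 1))  # 19.6 (seconds)

# Same lens on an APS-C (1.5x crop) body: noticeably shorter
print(round(npf_shutter(2.8, 5.9, 14, crop_factor=1.5), 1))  # 13.1
```

    The crop-factor example shows why the article stresses effective focal length: the tighter field of view means a star crosses a given pixel sooner.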

    Techniques & Composing

    Once you dial in your shutter speed, start shooting! If you find the shot came out too dark, increase the ISO. If it was too bright, decrease it. Compose your shot by moving the camera on the tripod, and when you are happy with a composition, turn on the camera’s self timer release mode, then let it rip. The self timer will ensure there is no shakiness interrupting the shot after the shutter button is pressed. A remote trigger will help with this too, but it is not necessary at first.

    Another thing to consider when out shooting is image stacking. If you find your shots are super noisy, you can take 10-20 shots of the same composition all back to back and merge them using software, which can average out the noise and thus mitigate it. I do recommend first trying out astrophotography with single images to get to grips with imaging the stars, then factoring in stacking later on. Again, something I may think about demonstrating in a future tutorial.
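    The averaging idea behind stacking can be sketched in a few lines of Python with NumPy (real stacking software also aligns the frames first to compensate for star movement; the synthetic “scene” below is just a stand-in for aligned exposures):

```python
import numpy as np

def mean_stack(frames):
    """Average a list of aligned exposures; random noise falls roughly as 1/sqrt(N)."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Demo: 16 synthetic noisy exposures of the same flat "scene"
rng = np.random.default_rng(0)
scene = np.full((100, 100), 50.0)               # true signal
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(16)]

stacked = mean_stack(frames)
print(stacked.std() < frames[0].std())  # True: averaging reduced the noise
```

    With 16 frames, the random noise drops by roughly a factor of four, which is why stacking 10-20 exposures makes such a visible difference in the shadows.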


    Post processing can be done in pretty much any RAW imaging software: Lightroom, Photoshop, Capture One, etc. Start with the sliders. Play around with the exposure, contrast, highlights, shadows, white balance, etc., and see what works for you. If you shot a stack sequence, you can process that too. I think there is a lot of creative freedom to be had with post processing, and you can really convey how you feel about your shots with these powerful tools.

    Final Thoughts

    Remember, everything here takes a lot of practice. If you find yourself getting discouraged or frustrated because a shot isn’t working out or something is difficult, that is normal! That is a sign of learning and progress and shows you are taking steps to accomplish something new and challenge your skills. At the end of the day, you are killing it!

    I hope this was helpful. If you like posts like this, I may consider starting a blog/tutorials category on my website. Shoot me a comment below if you want anything clarified, and I am happy to help when I have a hot second. Also feel free to check out my Instagram at @danieljstein, which is where I usually post my work and interact with the photography community as well.

    About the author: Dan Stein is a passionate nightscape photographer working full-time in New Jersey in his family’s commercial plumbing contracting business. The opinions expressed in this article are solely those of the author. You can find more of Stein’s work on his website and Instagram.
