Jordan Belamire was excited to experience QuiVr, a new fantastical virtual reality game, for the first time. With her husband and brother-in-law looking on, she put on a VR headset and became immersed in a snowy landscape. Represented by a disembodied set of floating hands along with a quiver, bow, and hood, Belamire was now tasked with taking up her weapons to fight mesmerizing hordes of glowing monsters.
But her excitement quickly turned sour. Upon entering online multiplayer mode and using voice chat, another player in the virtual world began to make rubbing, grabbing, and pinching gestures at her avatar. Despite her protests, this behavior continued until Belamire took the headset off and quit the game.
My colleagues and I analyzed responses to Belamire’s subsequent account of her “first virtual reality groping” and observed a clear lack of consensus around harmful behavior in virtual spaces. Though many expressed disgust at this player’s actions and empathized with Belamire’s description of her experience as “real” and “violating,” other respondents were less sympathetic—after all, they argued, no physical contact occurred, and she always had the option to exit the game.
Incidents of unwanted sexual interactions are by no means rare in existing social VR spaces and other virtual worlds, and plenty of other troubling virtual behaviors (like the theft of virtual items) have become all too common. All these incidents leave us uncertain about where “virtual” ends and “reality” begins, challenging us to figure out how to avoid importing real-world problems into the virtual world and how to govern when injustice happens in the digital realm.
Now, with Facebook heralding the coming metaverse and proposing to move our work and social interactions into VR, the importance of dealing with harmful behaviors in these spaces is drawn even more sharply into focus. Researchers and designers of virtual worlds are increasingly setting their sights on more proactive methods of virtual governance that not only deal with acts like virtual groping once they occur, but discourage such acts in the first place while encouraging more positive behaviors too.
These designers are not starting entirely from scratch. Multiplayer digital gaming—which has a long history of managing large and sometimes toxic communities—offers a wealth of ideas that are key to understanding what it means to cultivate responsible and thriving VR spaces through proactive means. By showing us how we can harness the power of virtual communities and implement inclusive design practices, multiplayer games help pave the way for a better future in VR.
The laws of the real world—at least in their current state—are not well-placed to solve the real wrongs that occur in fast-paced digital environments. My own research on ethics and multiplayer games revealed that players can be resistant to “outside interference” in virtual affairs. And there are practical problems, too: In fluid, globalized online communities, it’s difficult to know how to adequately identify suspects and determine jurisdiction.
And certainly, technology can’t solve all of our problems. As researchers, designers and critics pointed out at the Game Developers Conference, combatting harassment in virtual worlds requires deeper structural changes across both our physical and digital lives. But if doing nothing is not an option, and if existing real-world laws can be inappropriate or ineffective, in the meantime we must turn to technology-based tools to proactively manage VR communities.
Right now, one of the most common forms of governance in virtual worlds is a reactive and punitive form of moderation based on reporting users who may then be warned, suspended, or banned. Given the sheer size of virtual communities, these processes are often automated: for instance, an AI might process reports and implement the removal of users or content, or removals may occur after a certain number of reports against a particular user are received.
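The report-threshold mechanism described above can be sketched in a few lines of code. This is an illustrative toy, not any platform's actual system; the thresholds and action names are invented for the example.

```python
from collections import defaultdict

# Hypothetical escalation thresholds (not any real platform's values):
# crossing each one triggers a progressively harsher action.
WARN_AT, SUSPEND_AT, BAN_AT = 3, 6, 10

class ReportModerator:
    def __init__(self):
        self.reports = defaultdict(int)

    def file_report(self, reported_user: str) -> str:
        """Record one report and return the resulting action, if any."""
        self.reports[reported_user] += 1
        n = self.reports[reported_user]
        if n >= BAN_AT:
            return "ban"
        if n >= SUSPEND_AT:
            return "suspend"
        if n >= WARN_AT:
            return "warn"
        return "none"

mod = ReportModerator()
actions = [mod.file_report("player42") for _ in range(3)]
print(actions[-1])  # the third report crosses the warning threshold: "warn"
```

Even this toy makes the critique concrete: the system only reacts after reports accumulate, and a burst of coordinated false reports would trip the same thresholds as genuine abuse.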
While these kinds of responses can be effective in the short-term and demonstrate clear consequences for disruptive behavior, they have distinct problems. Because they are reactive, they do little to prevent problematic behaviors or support and empower marginalized users. Automation is helpful in managing huge amounts of users and material, but it also leads to false positives and negatives, all while raising further concerns surrounding bias, privacy, and surveillance.
As an alternative, some multiplayer games have experimented with democratic self-governance. Perhaps most famously, Riot Games implemented a Tribunal system that allowed players to review reports against other players and vote on their punishments in the multiplayer game League of Legends. A lack of accuracy and efficiency saw it shelved a few years later, but a similar system known as Overwatch continues to live on in Valve’s CS:GO and Dota 2. Forms of self-governance in VR are also on Facebook’s radar: A recent paper by researchers working with Oculus VR suggests that the company is interested in promoting community-driven moderation initiatives across individual VR applications as a “potential remedy” to the challenges of top-down governance.
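A Tribunal-style community review can be reduced to a simple voting rule. The sketch below is inspired by, not a reproduction of, Riot's shelved system; the 70 percent supermajority threshold is an assumption made for the example.

```python
from collections import Counter

def tribunal_verdict(votes, threshold=0.7):
    """votes: list of 'punish' or 'pardon' ballots from community
    reviewers; a supermajority of 'punish' votes upholds the report."""
    tally = Counter(votes)
    punish_share = tally["punish"] / len(votes)
    return "punish" if punish_share >= threshold else "pardon"

# Eight of ten reviewers vote to punish: 0.8 >= 0.7, so the report holds.
print(tribunal_verdict(["punish"] * 8 + ["pardon"] * 2))
```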
These kinds of systems are valuable because they allow virtual citizens to play a role in the governance of their own societies. However, co-opting members of the community to do difficult, time-consuming, and emotionally laborious moderation work for free is not exactly an ethical business model. And if—or when—toxic hate groups flourish, it is difficult to pinpoint who should be responsible for dealing with them.
One way of addressing these obstacles is to hire community managers (CMs). Commonly employed by gaming and social VR companies to manage virtual communities, CMs are visible people who can help facilitate more proactive and democratic decision-making processes while keeping both users and developers of VR accountable. CMs can remind players of codes of conduct and can sometimes warn, suspend, or ban users; they can also bring player concerns back to the development team. CMs may have a place in the metaverse too, but only if we figure out how to treat them properly.
Often the first port of call for gaming communities, CMs are the (virtually) smiling faces that welcome new players, generate hype around a game, and convey messages between developers and players. But it would be a mistake to think their role is merely to market a product: As they guide users through a membership life cycle from wide-eyed visitors to respected elders, they also help set good examples for positive behavior, reinforce codes of conduct, and set the right tone for a community—the same way a community elder might do in the physical world.
Assigning community managers in VR spaces adds empathy and the all-important “human touch” to the governance process. By boosting a sense of belonging, responsibility, and human presence, CMs can—at least in theory—help minimize problematic behaviors brought about through anonymity and automation.
Unfortunately, CMs are currently incredibly undervalued, undertrained, and underpaid, and frequently face a barrage of death threats, rape threats, and other forms of abuse from the users they are hired to care for. If community managers are to play a role in governing the virtual worlds of the metaverse, we must ensure that this essential work is better supported and compensated. An overworked and uninformed CM is likely to do (and come to) more harm than good.
Although best practices are still being worked out, the Fair Play Alliance—a coalition of gaming companies that aims to foster healthy gaming communities—has shared a framework for disruption and harms in gaming that offers advice on managing communities alongside the development of penalty and reporting systems. Combined with adequate pay, evidence-informed training, and in-house emotional support, these kinds of resources will help put CMs in a much better position to serve virtual communities sustainably.
VR spaces are, at their core, designed spaces. As such, the mechanics of the digital environment are also central to governance. Over a decade ago, Nick Yee, social scientist and cofounder of the gaming research company Quantic Foundry, argued that a multiplayer game’s framework of rules and coded design—its “social architecture”—can shape the interactions we have in virtual worlds. And if we can design virtual worlds to enable antagonistic interactions, we can design them to facilitate prosocial ones too.
Such design choices can be quite subtle and unexpected. Yee noted that in the multiplayer game Everquest, players who died in the game lost their loot and had to travel back to the site of their demise to retrieve it. This design feature helped facilitate altruistic behaviors, Yee suggested, as players had to ask each other for help in retrieving their lost items. In less playful VR spaces, one way of channelling this effect (along with more protective efforts) could involve encouraging users to ask others for help with virtual tasks such as onboarding, moving through or altering the environment, or acquiring avatar flair, giving users opportunities to actualize their more positive values.
To some extent, we’ve already started to see how the unique affordances of VR can be used to benefit users through design in other ways. In response to Belamire’s account, the QuiVr developers implemented a power gesture: a hand motion that, much like a “superpower”, turns on a personal bubble that causes offending avatars in the immediate vicinity to be muted and disappear from a user’s view (and vice versa) until the user chooses to turn it off. While largely symbolic, this gesture shows how crucial it is for developers to take these issues seriously. Having control over one’s personal space is important in the virtually embodied realm of VR, and simple, intuitive hand gestures that allow us to immediately control who or what we see could empower users in any VR space in a way that is simply impossible in the physical world.
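The logic of such a personal bubble can be sketched simply: once the gesture toggles it on, any avatar inside a given radius is filtered out of what the user sees and hears. Everything here (the radius, the names, the flat avatar records) is a hypothetical illustration, not QuiVr's implementation.

```python
import math

BUBBLE_RADIUS = 2.0  # meters; an assumed personal-space radius

def visible_avatars(me, others, bubble_on):
    """Return the avatars this user should still see and hear.
    With the bubble on, anyone within BUBBLE_RADIUS is muted and hidden."""
    if not bubble_on:
        return list(others)
    return [o for o in others
            if math.dist(me["pos"], o["pos"]) > BUBBLE_RADIUS]

me = {"name": "belamire", "pos": (0.0, 0.0, 0.0)}
others = [
    {"name": "groper", "pos": (0.5, 0.0, 0.0)},   # inside the bubble
    {"name": "friend", "pos": (5.0, 0.0, 0.0)},   # well outside it
]
print([o["name"] for o in visible_avatars(me, others, bubble_on=True)])
# ['friend']
```

Note that the filtering happens per-user: the offender simply vanishes from the victim's view (and vice versa), with no global state to dispute.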
To be sure, some proactive design approaches that work in games may not work in more serious VR spaces. For instance, encouraging players to endorse each other for good teamwork may reduce toxicity in games like Overwatch, but endorsing your colleagues in a VR work environment may feel like kindergarten at best or insidiously dystopian at worst (think MeowMeowBeenz).
Yet there’s still room for unconventional approaches to be further explored in VR: what if, for instance, we sentenced offending avatars to perform virtual community service, or undergo virtual mentoring or counseling programs? The idea of issuing virtual-world responses inspired by real-world penalties may sound absurd, but such approaches are not totally unheard of. The gaming platform Steam publicly labels the profiles of players who have been banned for cheating, and the president of Daybreak Game Company once invited cheaters to publicly apologize for their actions in order to be un-banned from the game H1Z1. As in the real world, virtual public shaming and incarceration present particular ethical concerns. But with careful scrutiny, more rehabilitative and restorative responses to virtual transgressions could have a meaningful place in the governance of the metaverse too.
As we explore how to govern users in VR, it is necessary to address the vital question of who is likely to be left out of these spaces. Existing biases have a nasty way of sneaking into our technological designs, resulting in virtual worlds that are hostile or inaccessible to particular groups. The physical requirements of VR can make it difficult for people with certain disabilities (such as visual impairments) to participate. Avatars are an anchor through which a person connects with and navigates a virtual space, but research in gaming reveals they are often designed in ways that misrepresent and exclude people of color. And as Belamire’s experience shows us, interactions in VR can be particularly harmful to women in ways that discourage them from participating at all.
Those who are left out of virtual worlds are often not so coincidentally underrepresented in virtual-world research and design teams as well. It is therefore imperative for us to acknowledge the barriers to inclusion that people face and promote diverse voices early in the development process. Positively, there have been increasing efforts to build more inclusive games that are also being translated to VR—The AbleGamers Charity, for instance, works with the gaming industry to make games more accessible to people with disabilities. These kinds of focused efforts are essential in helping us to avoid the reinforcement of existing divides and inequities in the VR metaverse.
Despite these challenges, it is imperative that we embrace—and demand a commitment to—our shared social responsibility to nurture flourishing and vibrant VR communities. A balanced approach to restrictions and penalties has an important role to play, as may real-world law. But we must be wary of relying too heavily on automated moderation, suspensions, and bans, which make up only one part of building healthy virtual worlds. “Out of sight, out of mind” is simply not a good enough maxim to live our virtual lives by. As our rich history of managing communities in multiplayer games shows, virtual governance can (and must) be much more than that.
By continuing to draw on our rich experiences in multiplayer gaming to explore the community-driven, inclusive, and empowering potentials of VR, we can help build digital communities we actually want to be a part of. The quality of our virtually-real lives depends on it.
Alec Baldwin Was Rehearsing Pointing Gun at Camera, Affidavit Says
The director of “Rust” gave the authorities the most detailed account yet of how the actor fatally shot the film’s cinematographer.
ALBUQUERQUE — Alec Baldwin was rehearsing a scene that involved pointing a revolver “towards the camera lens” when the gun — which the crew had been told did not contain live rounds — suddenly went off and killed the cinematographer, according to the film’s director, who was quoted in an affidavit released Sunday night.
The film’s director, Joel Souza, described hearing what “sounded like a whip and then loud pop.”
The account by Mr. Souza explained why Mr. Baldwin had been pointing the gun at the cinematographer, Halyna Hutchins. But it did not answer the question of how a gun that was not supposed to contain live ammunition wound up killing her.
The director, who was wounded in the shooting, told investigators that he had believed that the gun was safe and that it had been described as a “cold gun” in firearm safety announcements. He said that guns on the film’s set were typically checked by the film’s armorer, Hannah Gutierrez-Reed, and then checked again by Dave Halls, the assistant director, who would hand them to the actors.
On film sets, the order of who handles a weapon typically involves a precise sequence, several armorers in the industry said. But actors had been handed guns on the set by both Mr. Halls and Ms. Gutierrez-Reed, according to a producer of “Rust” who asked not to be named because of the ongoing investigation.
On Thursday, after preparing for the scene in a church set, Mr. Souza told investigators, there was a lunch break, and the crew was taken by shuttle elsewhere for food. He said that they returned to the set after lunch but that he was “not sure if the firearm was checked again.”
The new details, which emerged when the Santa Fe County Sheriff’s Office released the affidavit used to obtain a search warrant, provided the fullest account yet of the deadly shooting, which took place Thursday afternoon on a set outside Santa Fe.
Mr. Baldwin had been sitting in a wooden church pew, rehearsing a scene that involved “cross drawing” a revolver and pointing it at the camera lens, Mr. Souza said, according to the affidavit. Mr. Souza said that he had been standing beside Ms. Hutchins “viewing the camera angle.”
Mr. Souza saw Ms. Hutchins grabbing her midsection and starting to stumble backward. Then he noticed he was bleeding from his shoulder.
The details, woven together by Detective Joel Cano in an application for a search warrant to seize everything from camera memory cards to bone fragments and firearm discharge, provide a chilling account of the fatal shooting on a production set that had been beset by accidental gun discharges and labor disputes between producers and crew members. (The warrant was granted.)
“Upon making contact I did observe a visible injury to his right shoulder,” Detective Cano said in the affidavit, describing how he had interviewed Mr. Souza on Friday afternoon, after the director had been treated for his injury. Ms. Hutchins, who sustained a gunshot wound to the chest area, had already been pronounced dead on Thursday at University of New Mexico Hospital in Albuquerque.
“Joel stated there should never be live rounds whatsoever, near or around the scene,” Detective Cano wrote in the affidavit.
Mr. Souza was grappling with delays the day of the shooting, after about six members of the camera crew had quit over late pay and safety conditions, the affidavit said. Another crew had quickly been hired, but the production was off to a late start because of the labor problems. Mr. Souza said only one camera was available for recording before the shooting.
Asked about “the employees’ behavior,” Mr. Souza told investigators that “everyone was getting along” and that there had been “no altercations” to his knowledge.
The affidavit also includes notes from an interview with Reid Russell, a cameraman who was standing near Ms. Hutchins and Mr. Souza when the gun discharged.
Mr. Russell told the detective that after returning to the set from lunch, he had stepped outside for about five minutes; when he returned, according to the affidavit, Mr. Baldwin, Ms. Hutchins and Mr. Souza were setting up the scene and were already “in possession of the firearm.” Mr. Russell said he was not sure if the firearm had been inspected because he had been absent for those five minutes.
According to the affidavit, Mr. Halls grabbed the revolver from a gray, two-tiered tray set up by Ms. Gutierrez-Reed. Mr. Halls handed the gun to Mr. Baldwin and shouted, “cold gun,” which on a film set typically refers to an unloaded firearm.
While setting up the scene, the crew had to reposition the camera because there was a shadow. Mr. Russell told the detective that Mr. Baldwin was explaining how he was going to draw the gun, pulling it out from the holster, when the firearm discharged.
Mr. Russell said that Mr. Baldwin had been “very careful” with the firearm; during an earlier scene, Mr. Russell said, Mr. Baldwin had tried to ensure safety on set, making sure that a child wasn’t near him when he was discharging the gun. Asked about how members of the production team were behaving as they set up the scene, he said “everyone seemed to be getting along.”
Mr. Souza, the director, told the detective that because the crew had been setting up the scene when the gun discharged, the incident had not been filmed.
After the firearm was discharged, Mr. Russell told the detective he “remembered Joel having blood on his person, and Ms. Hutchins speaking and saying she couldn’t feel her legs.”
In an Instagram post on Monday morning, Mr. Baldwin’s wife, Hilaria Baldwin, expressed support for Ms. Hutchins’s family as well as for Mr. Baldwin, writing that it was “impossible to express the shock and heartache of such a tragic accident.”
Graham Bowley contributed reporting.
The Ghost Hand Illusion
STARE at the tiny, central black fixation spot on the white cross in a. After 30 seconds, transfer your gaze to a neutral gray background. You should see a dark—almost black—cross fading in and out. It is especially pronounced if you blink your eyes to revive the image and slow down the fading.
This effect is called a negative afterimage because the persistent ghost of the cross is the opposite of what you were looking at—it is dark instead of light. When you fixated on the white cross, you “fatigued” the retinal light receptors by bleaching out the cone pigments. So when you look at neutral gray, the region corresponding to where the white cross had been fires less vigorously than the surrounding area, and the net result is that it is seen as a dark cross.
Why does the cross fade? Partly because the fatigued receptors recover slowly as the bleached pigment regenerates. In contrast, with real images our eyes are in constant motion—images sail and jerk across the retina as we scan rooms, roads, texts or faces to identify novel or important bits. This continual movement prevents adaptation or fatigue because new patterns are constantly on any retinal area. With intense focus, you can eliminate all voluntary movements, and you should notice certain objects slowly fade away, as in b (termed the Troxler effect or Troxler fading). This fading is intermittent because your eyes never completely stop moving. Microscopic involuntary trembling characterizes even the steadiest fixation. This “physiological nystagmus” allows the brain’s edge-detecting neurons to avoid being fatigued, even during fixation, by providing moment-to-moment refreshing. But an afterimage, unlike a real image, remains stuck to the retina so the neurons are not refreshed and fatigue quickly kicks in.
All of what we have discussed so far is the conventional story. But there is much more to afterimages than meets the eye, as shown by the late Richard L. Gregory of the University of Bristol in England, who was the world’s preeminent perceptual psychologist. His book Eye and Brain launched many a student (including both of us) on a career in visual psychology and neurophysiology. The word “genius” is rarely used these days, but if anyone deserves the title, it would be Gregory.
Gregory studied positive afterimages because they fade more slowly and are more intense with more clearly defined borders, making them easier to study. In collaboration with Elizabeth L. Seckel of our laboratory, we have confirmed the results of many little-known experiments Gregory did on afterimages in the late s. The reader might wish to try them out today.
A Shot in the Dark
Have a friend aim a flash camera at you in a dimly lit room while you gaze at a tiny, luminous dot affixed to the center of the flash. When he “takes your picture,” you will get a positive afterimage. The persistent firing of photoreceptors makes you see a bright white disk long after the actual flash has gone.
Because the afterimage is glued to the retina, if you move your eyes around the room, the afterimage moves along with them. Now, while you have an afterimage, look at surfaces at different distances. The afterimage will appear on each surface as you fixate on it, and, amazingly, its apparent size will expand or shrink depending on how far or close the surface of regard is. What fun! Hold a piece of paper at arm’s length, move it toward your nose and watch the afterimage on it change in apparent size from a Ping-Pong ball to a pea. Cast your view back to a distant wall, and instantly the afterimage appears beach-ball-sized.
Why does this effect occur? Consider real objects. For example, if a friend standing five feet from you starts walking away, her retinal image size shrinks as she leaves. At 10 feet, it is half as tall (simple geometry). But of course, you do not see her shrinking—only as moving farther away. Perceived size varies directly with perceived distance (known as Emmert’s law). And in judging distance, the brain weighs information from motion, stereo, perspective, vergence angle, and so forth and applies the necessary “corrections”—a process called size constancy.
Usually this process is adaptive in that it allows you to perceive the object as it really is: constant in size regardless of distance and retinal image size. But in the case of an afterimage, the processing backfires. The afterimage does not change size on the retina with changes in viewing distance, but your brain still interprets it as doing so. Thus, when the afterimage is superposed on a far wall, your brain expects the retinal image to have shrunk from the size it would have been on a near wall. Your brain therefore expands the apparent size to compensate. It is important to realize that all this occurs on a kind of autopilot. There is no conscious reasoning or decision making such as: “If the object is far, it must have a small image; therefore, object size must be large.” That type of cogitation would be much too time-consuming to be effective. Why this entire process results in the image actually looking large rather than simply knowing it is large is a $64,000 philosophical question called the riddle of qualia. (We will stay away from this question in this column, even though we personally believe size constancy might one day help solve the riddle more readily than asking, “Why is red red?”)
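Emmert's law can be made concrete with a little trigonometry: an afterimage has a fixed visual angle, so the size the brain attributes to it grows linearly with the perceived distance of the surface it lands on. The numbers below are purely illustrative.

```python
import math

def visual_angle(height_m, distance_m):
    """Visual angle (radians) subtended by an object of a given height
    at a given distance: 2 * atan(h / (2d))."""
    return 2 * math.atan(height_m / (2 * distance_m))

def perceived_size(angle_rad, perceived_distance_m):
    """Emmert's law: for a fixed visual angle, perceived size scales
    with perceived distance."""
    return 2 * perceived_distance_m * math.tan(angle_rad / 2)

angle = visual_angle(0.04, 0.3)       # afterimage of a 4 cm disk at 30 cm
near = perceived_size(angle, 0.3)     # projected on paper at 30 cm
far = perceived_size(angle, 3.0)      # projected on a wall at 3 m
print(round(near, 2), round(far, 2))  # 0.04 0.4 -- ten times larger
```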
Emmert’s law also works in complete darkness. This is because when you look at an imaginary object at different distances, the angle between the two eyes’ lines of sight (vergence angle) changes, and the brain measures this change in eye position. So the afterimage shrinks and expands in darkness, depending on how far away you gaze.
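The geometry here is easy to verify: given a known interpupillary distance, the vergence angle determines the fixation distance, and vice versa. The 6.3-centimeter IPD below is a typical adult value, used only for illustration.

```python
import math

IPD = 0.063  # meters; a typical adult interpupillary distance (assumed)

def vergence_angle(distance_m):
    """Angle between the two eyes' lines of sight when fixating a point
    straight ahead at the given distance."""
    return 2 * math.atan(IPD / (2 * distance_m))

def distance_from_vergence(angle_rad):
    """Invert the geometry: recover fixation distance from vergence."""
    return IPD / (2 * math.tan(angle_rad / 2))

theta = vergence_angle(0.5)  # fixating a point 50 cm away
print(round(distance_from_vergence(theta), 3))  # 0.5
```

This is the signal the brain can read out even in complete darkness, which is why the afterimage still obeys Emmert's law with no visible surface at all.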
Next try the following experiment. Generate an afterimage with another flash. Then, in darkness, stand perfectly upright and move your head forward and back from the (invisible) wall in front of you. When you stick your neck out, you will find that the afterimage shrinks because the brain “assumes” it is a real object expanding and therefore applies a (false) correction. Perhaps signals from the neck muscles are sent to the visual centers to zoom the perceived size. Alternatively, when the motor-command centers in the brain send commands to neck muscles, they may send a kind of cc (as in e-mail) to the visual centers.
These facts about Emmert’s law are pretty straightforward, but the best is yet to come.
Affix a tiny, luminous spot on the center of your right palm and, in complete darkness, hold your hand out at arm’s length and look at the spot. Have a friend look over your shoulder, then take a flash photo aimed at your outstretched hand.
Now look straight ahead. You will see a vivid ghostly afterimage of your hand. Keep gazing forward so that the hand image is hovering in front of you—nothing surprising so far. But now move your real hand toward your nose, and you will get the impression that the hand image is shrinking. This miniaturization will happen even if there is an image in only one eye, so the source of distance information cannot be the vergence angle.
Gregory’s ingenious idea was that the proprioceptive information from muscle and joint sense in the arm must be going all the way to the brain’s size-perception centers; the messages do not have to originate in the eye muscles. The effect feels spooky, because you would expect your real hand image to grow as it approaches your nose (try it in a fully lit room), but the afterimage actually shrinks because of proprioception, driven by Emmert’s law. The arm muscles are telling your brain that the glowing hand is approaching you, yet it appears to shrink. So you are startled. Moreover, if you move the hand too close to yourself, the shrinking of the ghost ceases. This result may occur because you do not usually bring or see your hand that close, so your size-constancy mechanisms are not “wired” for it. It might be equally interesting to affix a long dummy arm to artificially lengthen your arm to see what happens.
Here is another experiment with the same setup. Move your hand away from its afterimage so that the afterimage remains out in front, but the hand is not. If you are like most of us, you will see the afterimage suddenly starting to fragment, the so-called crumble effect reported by P. Davies, then at the University of Aberdeen in Scotland. This breaking apart happens because the brain is confronted with a discrepancy between the visual location of the afterimage and the proprioceptive location of the arm. Abhorring discrepancies, the brain simply starts “shutting down” one image. It is easier to halt an evanescent, inherently unstable afterimage than to shut down muscle and joint sense from the arm. So the image starts to fade and fragment. (Our colleague Stuart Anstis of the University of California, San Diego, has pointed out to us that the effect also occurs for other body parts.)
Another surprising effect takes place if you hold your right hand out in front of you in complete darkness so that congruence is reestablished, and the afterimage of the hand once again robustly reappears. Now move your left hand in between your nose and outstretched right hand (and its afterimage). You would not normally expect anything to happen because, unlike a real glowing right hand, which would be occluded by the interposed left hand, the afterimage should not be occluded—it is still stuck on the retina and should now be seen “superimposed” on the (albeit invisible) left hand. Astonishingly, in at least some trials, the afterimage becomes “occluded,” just as a real hand would—as if the mere expectation is enough to make it fade.
Do these effects occur only with hands, or can they happen for the entire body? By using a suitable placement of the flash camera in front of you while you look down on your own body, it is possible to create an afterimage of your entire body. It helps to wear white clothes, so the afterimage is brighter. (We did this experiment in collaboration with Seckel.) If you now tilt your eyes and head up to look straight ahead, a ghostly apparition of your body will start floating upward away from your real body, creating a momentary feeling of instability. More surprisingly, when we tried the experiment on a patient with chronic intermittent bodily pain, the discrepancy seemed to alter the pain—sometimes increasing it momentarily but mostly reducing it. It remains to be seen if the effect is merely wish-fulfilling suggestibility or a real sensory phenomenon.
Using a powerful flashgun, the reader might wish to try other ingenious variations on the theme. What if you were to superpose the afterimage of the hand on your hand and wiggle your fingers? Have fun!
All the ways to use your phone with one hand
This story has been updated. It was originally published on September 17,
Remember back in the era of the iPhone 4 and 5, when Apple’s phones seemed diminutive compared to their growing competitors? The company said it kept them small to stay usable with one hand. It eventually caved, though, and even today’s smaller phones (like the iPhone 13 mini and Pixel 4) are bigger than what we had back in the day.
There are, however, a few tricks and tools—many of them lesser-known—designed to help your short thumbs deal with those large screens.
Enable Reachability or One-Handed modes
You may be surprised to learn that your phone actually has a one-handed mode built right in, designed to make all those icons easier to reach. The iPhone’s version is called Reachability mode, and I find most people enable it by accident and think it’s some sort of glitch, rather than a useful feature.
That might be why Apple disabled the feature by default in iOS 12, but you can get it back by heading to Settings > Accessibility > Touch and toggling Reachability on. From there, you can either swipe down near the bottom of the screen (on iPhones without a Home button) or double-tap—not click, but tap—the Home button (on iPhones with this button) to shift your home screen icons down for easy access.
[Related: Smartphones aren’t designed for seniors, but these tweaks make them more accessible]
Many Android phones have their own one-handed mode. On Galaxy devices, you’ll find it in Settings > Advanced Features > One-Handed Mode. On Pixel phones, the path is a little longer: go to Settings, then System, Gestures, and finally One-handed mode. If you want something a bit different, Reachability Cursor is another popular tool, giving you a mouse-like cursor to reach faraway icons on the screen. Bottom Quick Settings is a good companion app too, allowing you to reach Android’s Quick Settings panel from the bottom, rather than the top, of the screen.
Arrange your home screen for one-handed access
While the above tricks will get you pretty far, there are other things you can do to make reaching your apps even easier. A bit of home screen organization, for example, can go a long way—either on its own or in addition to the reachability tools we’ve already talked about.
Both iOS and Android allow you to rearrange your home screen apps by pressing and holding on the icons to move them around. Put your most-used apps in the bottom right corner of the screen (or bottom left, if you’re a southpaw) and they’ll be a lot easier to reach. Android even allows you to leave the unreachable portion of the screen empty, so all your apps are reachable. Apple forces you to arrange your iOS icons in a grid, but you can use a tool like Makeovr to create invisible icons, thus allowing you to push all your apps to an easily reachable corner while leaving the rest of the screen empty.
Android users have a few more tricks up their sleeve in this realm, too. Google has been adding advanced hand gestures since Android 10, and now you can swipe down anywhere on the screen to reveal the notifications drawer. You might also want to check out our list of Android home screen replacements, which may also provide some one-hand friendly layouts.
Shrink the keyboard
Typing on a keyboard with one thumb isn’t exactly fast, and if you’re tapping out more than a couple words, you’ll probably want to use both hands. But for quick texts like “on my way,” one hand works well… as long as you can reach all the keys.
Many keyboards, on iOS and Android, have a one-handed mode that shifts the keyboard toward the right or left edge of the screen, so you can reach all the keys with a single thumb. On iOS, you’ll usually find this option by holding the globe icon and enabling one-handed mode (the keyboard icons with arrows pointing left or right). On Android, it can vary from keyboard to keyboard. Google’s Gboard, for example, lets you switch to one-handed mode by holding down the comma key. It won’t make typing on a phone enjoyable or anything, but it’ll at least save you from making so many typos.
Tweak your apps
Finally, your apps may have some one-handed features built-in. We obviously can’t detail every app here, but dig through the settings of your most-used apps and see what you can find.
[Related: Take control of your apps’ permissions]
Pocket Casts, for example, arranges your podcasts in a grid by default, but you can change this to a list format that’s much easier to navigate with one hand. The official Reddit app allows you to swipe right to go back, eliminating the need to reach the faraway back button. Dig around in your favorite apps’ settings to see what features they offer that may make one-handed use easier. Every little bit counts.
- A dramatically more powerful camera system.
- A display so responsive, every interaction feels new again.
- The world’s fastest smartphone chip.
- Exceptional durability.
- And a huge leap in battery life.
Super Retina XDR display with ProMotion
iPhone 13 Pro Max
iPhone 13 Pro
Our Pro camera system gets its biggest upgrade ever. With next-level hardware that captures so much more detail. Superintelligent software for new photo and filmmaking techniques. And a mind-blowingly fast chip that makes it all possible. It’ll change the way you shoot.
Macro photography comes to iPhone.
With its redesigned lens and powerful autofocus system, the new Ultra Wide camera can focus at just 2 cm — making even the smallest details seem epic. Transform a leaf into abstract art. Capture a caterpillar’s fuzz. Magnify a dewdrop. The beauty of tiny awaits.
Macro video, anyone?
Macro stills are just the beginning. You can also shoot macro videos — including slow motion and time-lapse. Prepare to be mesmerized.
iPhone 13 Pro was made for low light. The Wide camera adds a wider aperture and our largest sensor yet — and it leverages the LiDAR Scanner for Night mode portraits. Ultra Wide gets a wider aperture, a faster sensor, and all-new autofocus. And Telephoto now has Night mode.
The Wide camera captures up to 2.2x more light for better photos and videos
The Ultra Wide camera captures 92% more light for better photos and videos
Night mode now on every camera
Sharper, more detailed photos and videos in any light
The new Telephoto camera features a 77 mm focal length and 3x optical zoom — great for classic portraiture or shooting clearer photos and videos from far away. For closer subjects, try Portrait mode, where you can dial in the bokeh and experiment with studio-quality lighting effects.
- 3x optical zoom on Telephoto for closer close-ups
- 6x optical zoom range across the system for more framing options than ever
Now iPhone can shoot with shallow depth of field and automatically add elegant focus transitions between subjects. Cinematic mode can also anticipate when a prominent new subject is about to enter the frame and bring them into focus when they do, for far more creative storytelling. You have the option to change focus or adjust the level of bokeh even after capture. We can’t wait to see what you do with it.
- The only smartphone that lets you edit the depth effect after you shoot
- Shoot with the Wide, Telephoto, or TrueDepth camera in Cinematic mode
- Cinematic mode supports Dolby Vision HDR
To bring Cinematic mode to iPhone, we carefully studied how master filmmakers use rack focus to add drama and emotion to the story.
On Hollywood shoots, pulling focus requires a talented team of experts. Like a cinematographer, who makes the overall call about what’s in focus and when that changes. And a focus puller, who makes sure the transition is smooth, the timing is spot on, and the subjects are perfectly crisp.
Making all this happen automatically on your iPhone was no small feat.
First we had to generate high-quality depth data so Cinematic mode knows the precise distance to the people, places, and pets in a scene. And because this is video, we needed that depth data continuously — at 30 frames per second.
We also trained the Neural Engine to work like the experts. It makes on-the-fly decisions about what should be in focus, and it applies smooth focus transitions when that changes. If you want creative control, you can always hop in the director’s chair and rack focus manually, either when you shoot or in the edit.
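The rack focus described above is, at its core, an eased transition between two focus distances spread across video frames. Here is a minimal sketch of that idea; the function names and numbers are illustrative assumptions, not Apple’s implementation:

```python
# Toy "rack focus" planner: ease the lens focus distance from one subject
# to another over a fixed duration at a video frame rate. Illustrative only.

def smoothstep(t: float) -> float:
    """Ease-in/ease-out curve so the focus pull starts and ends gently."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def rack_focus(start_m: float, end_m: float, duration_s: float, fps: int = 30):
    """Yield one focus distance (meters) per frame of the transition."""
    frames = max(1, round(duration_s * fps))
    for i in range(frames + 1):
        t = smoothstep(i / frames)
        yield start_m + (end_m - start_m) * t

# Pull focus from a subject 3 m away to one 1 m away over half a second.
plan = list(rack_focus(3.0, 1.0, 0.5))
```

In a real system the start and end distances would come from continuously estimated depth data, and the transition would be triggered by the subject-detection decisions the text describes.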
It’s so computationally intense, we needed a chip that could handle the workload. Enter A15 Bionic.
The sheer computational power needed to run the machine learning algorithms, render autofocus changes, support manual focus changes, and grade each frame in Dolby Vision — all in real time — is astounding.
It’s like having Hollywood in your pocket.
Customize your camera to lock in your look.
Photographic Styles apply your preferred Tone and Warmth settings to your photos. But unlike filters, they keep things like skies and skin tones natural. Choose an Apple-designed preset — Vibrant, Rich Contrast, Warm, or Cool — and if you want, fine-tune it even further. Set your style once to get the look you love every time.
- Our advanced image pipeline renders your custom style in real time
- Tone: Increase for brighter, more vivid colors. Decrease for stronger shadows and contrast.
- Warmth: Increase to enhance golden undertones. Decrease to bring in more blue undertones.
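As a rough illustration of what sliders like Tone and Warmth do to a pixel, here is a toy sketch; the formulas and function names are invented for illustration and are not Apple’s image pipeline:

```python
# Toy tone/warmth adjustment on a single RGB pixel with channels in [0, 1].
# This is an illustrative assumption, not Apple's Photographic Styles code.

def apply_style(rgb, tone=0.0, warmth=0.0):
    """tone brightens (+) or deepens shadows (-); warmth shifts the pixel
    toward golden (+) or blue (-) undertones."""
    def clamp(v):
        return max(0.0, min(1.0, v))
    r, g, b = rgb
    # Simple tone lift applied equally to all channels.
    r, g, b = r + tone, g + tone, b + tone
    # Warmth trades red against blue: golden vs. blue undertones.
    r, b = r + warmth * 0.1, b - warmth * 0.1
    return (clamp(r), clamp(g), clamp(b))

# A mid-gray pixel nudged brighter and warmer gains red, loses blue.
warm = apply_style((0.5, 0.5, 0.5), tone=0.1, warmth=1.0)
```

A real pipeline would apply such adjustments selectively, leaving skies and skin tones untouched, which is what distinguishes a style from a plain filter.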
- The first smartphone to provide an end-to-end pro workflow, allowing you to record and edit in ProRes or Dolby Vision
The high color fidelity and low compression of ProRes let you record, edit, and deliver broadcast-ready content on the go. Now you can complete a project in ProRes entirely on your iPhone. Or easily bring ProRes videos from your iPhone into Final Cut Pro on your Mac.
The tech behind every shot.
Smart HDR 4 optimizes each part of the scene.
Harnessing the machine learning power of the Neural Engine, Smart HDR 4 now makes unique adjustments for multiple people in a scene. Our software and ISP automatically refine contrast, lighting, and skin tones for each person. So everyone always looks amazing.
For mid- to low-light shots, Deep Fusion kicks in — using the Neural Engine to perform a pixel-by-pixel analysis of various exposures and fusing the best parts into your final image. It delivers extraordinary detail, bringing out even the subtlest textures in your photos.
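Deep Fusion’s internals are not public, but the general technique of fusing the best-exposed parts of several frames pixel by pixel can be sketched with a toy weighting scheme. Everything here is an illustrative assumption, not Apple’s pipeline:

```python
# Toy exposure fusion: for each pixel, weight each frame by how well exposed
# that pixel is (closest to mid-gray), then take the weighted average.
# Frames are flat lists of luminance values in [0, 1].

def well_exposedness(v: float) -> float:
    """Pixels near mid-gray (0.5) get high weight; clipped pixels get ~0."""
    return max(1e-6, 1.0 - abs(v - 0.5) * 2.0)

def fuse(frames: list[list[float]]) -> list[float]:
    """Per-pixel weighted average across aligned frames of the same scene."""
    fused = []
    for i in range(len(frames[0])):
        weights = [well_exposedness(f[i]) for f in frames]
        total = sum(weights)
        fused.append(sum(w * f[i] for w, f in zip(weights, frames)) / total)
    return fused

# Three bracketed exposures of a two-pixel scene: dark, mid, and bright.
dark, mid, bright = [0.05, 0.4], [0.3, 0.9], [0.6, 1.0]
result = fuse([dark, mid, bright])
```

The second pixel illustrates the point: it is blown out in the bright frame, so the fused value is drawn almost entirely from the frames where it was well exposed.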
- 77 mm focal length
- 3x optical zoom
- ƒ/2.8 aperture
- Focus Pixels
- 6-element lens
- 13 mm focal length
- ƒ/1.8 aperture
- Faster sensor
- Focus Pixels
- 6-element lens
- 26 mm focal length
- 1.9 μm pixels
- ƒ/1.5 aperture
- 100% Focus Pixels
- 7-element lens
- Sensor-shift OIS
Our three most powerful cameras ever
The TrueDepth camera system is a total Pro too, with:
- Cinematic mode
- Photographic Styles
- ProRes video recording
- Dolby Vision HDR recording
- Portrait mode
- Night mode selfies
- Smart HDR 4
- Deep Fusion
- and more
No wonder your selfies look so good.
- A15 Bionic and the TrueDepth camera also power Face ID, the most secure facial authentication in a smartphone
Meet the 120Hz adaptive refresh display that changes the game.
The new Super Retina XDR display with ProMotion can refresh from 10 to 120 times per second, and all kinds of frame rates in between. It intelligently ramps up when you need exceptional graphics performance, and ramps down to save power when you don’t. It even accelerates and decelerates naturally to match the speed of your finger as you scroll. You’ve never felt anything like it.
- iOS 15 is optimized for ProMotion, so the things you do every day feel phenomenally fluid
Up your game.
The display’s ability to refresh up to 120Hz — combined with the amazing graphics performance of the new 5-core GPU on A15 Bionic — makes iPhone 13 Pro perfect for power gamers.
Up to 25% brighter outdoors for content that looks even more vivid in sunlight
Custom OLED technology pushes the display’s incredible resolution and color right to the edge
Up to 1200 nits peak brightness for your HDR photos and videos
Even more display area thanks to a smaller TrueDepth camera system
Incredible color fidelity makes all your content look true to life
Striking contrast and resolution create true blacks, bright whites, sharp detail, and crisp text
Running a display that refreshes 120 times every single second requires a ton of power. But you don’t really need all that speed all the time.
One way to be more efficient is to set standard frame rates for different types of content. Say, 10 fps for a book and 60 fps for a game. The problem with this approach is that frame rates are always changing. If the game drops to 30 fps for a menu screen while the display is set at 60 fps, you end up using precious battery life without seeing any benefit from the higher frame rate.
For us, power is far too important to waste on empty frames. So we set out to design a more intelligent solution. One that can adapt to ever-changing refresh rates.
With ProMotion, there are no settings. Refresh rates are tied to whatever’s happening on the screen. If your game drops to 30 fps, ProMotion dips to 30 fps too. If you’re watching a video that was filmed at 24 fps, it plays at 24 fps. All of this saves power.
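The content-matching behavior described above can be sketched as picking the lowest panel rate that still shows every content frame. The rate table below is an assumed example for illustration, not Apple’s actual panel timing list:

```python
# Minimal sketch of content-adaptive refresh: choose the smallest supported
# refresh rate at or above the content frame rate, so no frame is dropped
# and no power is wasted on empty refreshes. The table is an assumption.

SUPPORTED_HZ = [10, 12, 15, 20, 24, 30, 40, 60, 80, 120]

def pick_refresh_rate(content_fps: float) -> int:
    """Return the lowest supported rate that can display content_fps."""
    for hz in SUPPORTED_HZ:
        if hz >= content_fps:
            return hz
    return SUPPORTED_HZ[-1]  # cap at the panel maximum

# A 24 fps film plays at 24Hz; a game dipping to 30 fps drives 30Hz;
# fast scrolling at 100 fps ramps the panel up to its 120Hz maximum.
rates = [pick_refresh_rate(f) for f in (24, 30, 100)]
```

The design point is that the decision is driven by content, not by a user-facing setting, which is exactly the contrast the text draws with fixed per-content-type frame rates.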
ProMotion makes it feel like you’re reaching right through the screen and touching the code.
We also considered the way your finger speeds up and slows down as you scroll, swipe, or pinch. The speed of your finger now drives the speed of each gesture. iOS 15 is full of moments where 120Hz makes the interface feel glued to your finger. It’s just so fast. But even then, ProMotion only uses 120Hz at the precise moment you’ll feel the impact.
It would have been much easier to put a 120Hz iPhone in your hand without worrying about battery life. But that’s not the Apple way. We wanted to deliver fast frame rates when you need them, and preserve battery life when you don’t.
All-new 5-core GPU delivers up to 50% faster graphics performance than any other smartphone chip
New CPU performance and efficiency cores power through complex tasks and preserve battery life
Superfast Neural Engine performs up to 15.8 trillion operations per second, enabling Cinematic mode, Smart HDR 4, and more
Advanced ISP takes noise reduction and tone mapping to the next level
Secure Enclave protects personal information like your Face ID data, contacts, and more
On-device processing keeps things like your Siri requests and interactions with Live Text private
Year after year, iPhone silicon pushes our idea of what’s possible with smartphones. A big reason why is that we build long-term product road maps — and bring our teams together — in ways no other company can.
Deep integration between our teams allows us to deliver features you can’t find on any other smartphone.
That’s how we deliver features like ProMotion, which have to be planned years in advance. Our chip team fully understood the needs of the display hardware, display software, and operating system teams and took them into account for A15 Bionic.
For example, we overhauled the display engine to support variable frame rates, then designed the system so ProMotion could capture the incredible graphics performance — and efficiency potential — of the new 5-core GPU.
In turn, the display software and iOS teams decided where all that speed would make the biggest impact, and where they could optimize refresh rates to use a lot less power.
What’s truly unique about Apple is that we don’t just start with a superfast chip and build features around it. Instead, we start with an idea about a great experience we’d like you to have, and then we all work together to bring it to life.
iPhone 13 Pro Max has the best battery life ever on iPhone.
- Up to 2.5 more hours of battery life on iPhone 13 Pro Max
- Up to 1.5 more hours of battery life on iPhone 13 Pro
Add a MagSafe charger for faster wireless charging.
No one does 5G like iPhone.
The world is quickly moving to 5G. Streaming, downloading — everything happens so much faster. 5G is even fast enough for serious multiplayer gaming, sharing AR videos, and more. With Smart Data mode, iPhone will downshift automatically to save power when you don’t need all that speed.
- More 5G bands for 5G speed in more places
In the moment.
iOS 15 lets you keep the conversation going while sharing movies, music, or whatever’s on your screen right in FaceTime. Stay in the zone by filtering out any notifications that aren’t relevant to the task at hand. And interact with text in images to quickly send email, make calls, get directions, and more.
Learn more about iOS 15
Privacy is built in.
iPhone helps put you in control of your personal information. For example, Privacy Nutrition Labels help you see how apps use your data. Apps need your permission to track your activity across other companies’ apps or websites. And that’s just for starters. Learn more about Apple
Good design is good for the planet.
Our stores, offices, data centers, and operations are already carbon neutral. By 2030, our products — and your carbon footprint from using them — will be, too. This year we eliminated the plastic wrap around the iPhone 13 and iPhone 13 Pro boxes, saving 600 metric tons of plastic. And our established final assembly sites now send zero waste to landfills. Learn more about Apple and the environment
Stream songs, albums, and curated playlists. Catch the shows everyone is raving about. Discover exciting new games. Keep up on the news and stories you love. Find your next favorite workout. Apple services put so much at your fingertips, and Apple One bundles them all into a simple subscription. Learn more about Apple One
Multiply the magic.
Everything you love about your iPhone gets even better when you use it with a Mac, iPad, or Apple Watch. It all just works together — seamlessly. Answer a call on whatever is close at hand. Take a photo on your iPhone and watch it instantly appear on your Mac. And see all your texts, all the time, on all your devices. Easy. Learn more about how Apple products work together
Check it out.
Take the perfect iPhone accessory and make it yours with free engraving — only from Apple.