Published 09:41 IST, April 7th 2023
Meta unveils SAM, a new AI model that can spot and segment items inside an image
Facebook's parent company Meta unveiled its latest artificial intelligence tool SAM, an abbreviation for Segment Anything Model.
Facebook's parent company Meta unveiled its latest artificial intelligence (AI) tool SAM, an abbreviation for Segment Anything Model. The futuristic model, as its name suggests, can spot and "segment" various items in an image or video. In a paper released on Wednesday, the tech giant said that SAM "is designed and trained to be promptable, so it can transfer zero-shot to new image distributions and tasks".
"We evaluate its capabilities on numerous tasks and find that its zero-shot performance is impressive – often competitive with or even superior to prior fully supervised results," Meta explained on its official blog. According to the company's research department, the Segment Anything Model can spot any object inside an image or a clip, even if it has never seen some of those items during its training stage.
The new AI model works rather simply, allowing users to select objects either by tapping on them or by naming them in text prompts. A demo of the process showed that typing "cat" prompted the tool to draw boxes around the cats in an image. SAM's release comes alongside a dataset that will "foster research into foundation models for computer vision".
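To illustrate the click-to-segment interaction described in the demo, here is a deliberately simplified sketch: a toy "point prompt" segmenter that flood-fills pixels similar to the one the user clicked. This is only an illustration of the interaction style; SAM itself uses a learned image encoder and a promptable mask decoder, not a flood fill.

```python
import numpy as np
from collections import deque

def segment_from_point(image, seed, tol=10):
    """Toy point-prompted segmentation (illustrative only, not SAM):
    flood-fill the connected region of pixels whose intensity is
    within `tol` of the clicked seed pixel, and return a binary mask."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = int(image[seed])
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if mask[y, x]:
            continue
        mask[y, x] = True
        # visit 4-connected neighbours that match the seed intensity
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(image[ny, nx]) - seed_val) <= tol):
                queue.append((ny, nx))
    return mask

# A tiny grayscale "image": a bright 4x4 square on a dark background.
img = np.zeros((8, 8), dtype=np.uint8)
img[2:6, 2:6] = 200

# "Clicking" inside the square segments exactly the bright region.
mask = segment_from_point(img, seed=(3, 3))
print(mask.sum())  # 16 pixels: the 4x4 bright square
```

The point is the interface, not the algorithm: a single coordinate prompt is turned into a full object mask, which is the interaction SAM generalises with a learned model instead of a hand-written rule.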
SAM to broaden access to existing technology
While the AI tool is brand-new, Meta has used similar technology in the past to carry out basic activities such as moderating inappropriate content, tagging images, and deciding which posts to recommend to users on Facebook and Instagram.
However, the company believes that the release of SAM will widen the scope and reach of such technology. The new model and its corresponding dataset will soon be available for download under a non-commercial license. To ensure the technology is not used for malicious purposes, users will have to agree to upload their pictures for research purposes only.