The Samsung Group of Seoul, South Korea, is a conglomerate whose many subsidiaries develop a wide range of electronics. Samsung’s product line is as varied as washing machines, televisions, microwaves and handheld electronic devices. Recently, the manufacturer announced that it would be getting more serious about the tablet market with the upcoming release of the 12.2-inch Galaxy Note tablet.
In this edition of IPWatchdog’s Companies We Follow, our series returns once again to Samsung and its recent appearances at the U.S. Patent & Trademark Office. As has often been the case recently, many of the more intriguing patents and patent applications from Samsung deal with electronic device development. One issued patent protects a method of allowing mobile phones to give off fragrance in response to user interaction. An application filed by Samsung describes a better system of constructing biochips to monitor drug trials. Upgrades to electro-wetting displays, which use water and oil to control light transmission, are featured in a second patent application.
Smarter computing systems are also a major focus for Samsung. We look at an application that would protect a system for controlling social network user interactions based on emotional states, as well as an issued patent covering a more efficient system of detecting eye regions for facial recognition.
Apparatus and Method for Sharing User’s Emotion
U.S. Patent Application No. 20130144937
Social networks are major Internet platforms through which millions of online users interact digitally on a daily basis. Of the many personal matters that can be shared, a user’s current emotional state can often be determined by analyzing a social media post. Research on social media interactions has shown that one user’s emotional state can affect the emotional state of other users, whether positively or adversely. The more these users interact, the greater the impact each can have on the other’s emotional state.
This Samsung patent application would protect a system of classifying social network user emotional states to determine an emotion rate between two users. An emotion classification unit would analyze user posts and interactions to determine a user’s current emotional state and any rate of change in that state. It appears that Samsung is seeking to improve social media communications by limiting interactions between members who provoke negative emotions, or increasing interactions between members who instill positive emotions.
Claim 1 of this patent application would protect:
“An emotion sharing apparatus comprising: an emotion classification unit configured to classify a recognized emotional state of a user of a terminal into one of at least two kinds of emotional states; an emotion analysis unit configured to calculate an emotion rate defined based on a ratio of one kind of emotional state of the at least two kinds of emotional states to another kind of emotional state of the at least two kinds of emotional states; a change-in-emotion rate calculator configured to calculate a change in the emotion rate; and a transmitter configured to transmit the change in the emotion rate to another terminal.”
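The structure of Claim 1 can be illustrated with a minimal sketch. Everything here is hypothetical: the keyword-based classifier is a toy stand-in for whatever emotion recognition the application actually contemplates, and the class and method names are invented for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EmotionTracker:
    """Toy model of the claimed apparatus: classify posts into two emotional
    states, compute an emotion rate (a ratio of one state to the other), and
    track the change in that rate between readings."""
    positive: int = 0
    negative: int = 0
    _last_rate: Optional[float] = None

    def classify(self, post: str) -> None:
        # Stand-in for the "emotion classification unit": a crude keyword
        # check sorts each post into one of two emotional states.
        if any(w in post.lower() for w in ("great", "happy", "love")):
            self.positive += 1
        else:
            self.negative += 1

    def emotion_rate(self) -> float:
        # "Emotion analysis unit": ratio of one kind of emotional state to
        # the other (guarding against division by zero).
        return self.positive / max(self.negative, 1)

    def change_in_rate(self) -> float:
        # "Change-in-emotion rate calculator": delta since the last reading;
        # a transmitter would send this value to another terminal.
        rate = self.emotion_rate()
        prev = self._last_rate if self._last_rate is not None else rate
        self._last_rate = rate
        return rate - prev
```

In the claimed system, the transmitted change in emotion rate is what would let one terminal react to shifts in another user’s emotional state.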
Mobile Communication Terminal Having Fragrance Member
U.S. Patent No. 8457695
Digital electronic devices are capable of providing multimedia that appeals to various human senses, such as video or music. There have been attempts to include components within devices that can also engage a human’s sense of smell. However, plastic fragrance cards and aromatic liquid capsules are depleted or weaken over time, and exchanging them can be cumbersome and expensive.
Samsung has earned the right to protect a fragrance member for mobile phones that requires minimal replacement of aromatics. The fragrance member could be composed as an inner layer within a sliding mobile phone, or as a fragrance rod resting within the hinge of a folding mobile phone. Friction generated naturally by the phone’s moving components would cause the fragrance member to release an aroma.
As Claim 1 states, Samsung has gained protections over:
“A mobile communication terminal, comprising: a fragrance member, the fragrance member comprising a synthetic resin and an aromatic; a fixed body comprising an upper surface where a key pad is provided; and a movable body comprising an upper surface where a display is provided and a lower surface opposite the upper surface, the lower surface of the movable body being slidably connected to the upper surface of the fixed body, wherein the fragrance member is disposed on the upper surface of the fixed body to generate friction between the lower surface of the movable body and the fragrance member in response to the movable body sliding.”
U.S. Patent Application No. 20130137168
Computer chips that monitor and record a wide array of biological data have been constructed and used in the past. Biochips and biotechnologies have been developed to aid research on genetic DNA composition and cellular protein processes. However, using biochips in research experiments, especially drug and medication research, is difficult, mainly because of limitations in coupling a data chip for logging biological data with a meta chip containing a medication.
This patent application, filed by Samsung, describes the South Korean manufacturer’s development of a biochip including both a data chip and a meta chip which can be coupled manually. The biochip includes various coupling units on the substrate surface and near the end plates of the chip, which use a cone-shaped column to fit within a groove on a second surface, connecting the two pieces together securely. A medication would be stored within the chip’s substrate so that it can be applied to a patient when commanded.
Claim 1 of this Samsung patent application would offer the corporation the right to protect:
“A biochip, comprising: a first substrate having a plurality of biological materials disposed thereon at predetermined intervals; first coupling units respectively extending from both ends of the first substrate and rotatably connected to the first substrate; and a second substrate including first groove portions formed in both ends thereof, the first coupling units being respectively inserted into the first groove portions.”
Electro-Wetting Display Substrate and Method of Manufacturing the Same
U.S. Patent Application No. 20130141317
The electro-wetting display (EWD) for electronic computing devices is a recent development that utilizes aqueous and non-aqueous solutions to control the light emitted by a display monitor. An EWD transmits an electrical charge across a layer of water contained within the display. Voltage differences between electrodes within the display create surface tension in the water. Oil in the non-aqueous solution is repelled from the water in response to this voltage difference, allowing varying levels of light to be transmitted through the screen.
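The voltage dependence described above is conventionally modeled by the Young-Lippmann equation; the sketch below is standard electro-wetting physics, not taken from the Samsung application itself, and the parameter values used in the usage note are purely illustrative.

```python
import math

def contact_angle(v, theta0_deg, eps_r, d, gamma):
    """Contact angle (degrees) of the water layer under applied voltage v,
    via the Young-Lippmann equation.

    theta0_deg : contact angle at zero voltage
    eps_r      : relative permittivity of the insulating layer
    d          : insulator thickness (m)
    gamma      : water/oil interfacial tension (N/m)
    """
    eps0 = 8.854e-12  # vacuum permittivity, F/m
    cos_theta = (math.cos(math.radians(theta0_deg))
                 + eps_r * eps0 * v**2 / (2 * d * gamma))
    # Clamp to [-1, 1]: beyond this, the surface is fully wetted/dewetted.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))
```

Raising the voltage lowers the contact angle, which pulls the oil film aside and lets more light through the pixel; this is the mechanism the aperture-ratio improvement in the application builds on.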
This Samsung patent application reflects the development of an EWD that has an increased aperture ratio for pixels due to a different electrode design. A pixel electrode, which controls the amount of voltage traveling through a specific pixel, is overlapped with a notch electrode that is connected to a drain electrode. The notch, pixel and drain electrodes are all connected through a switching element that receives a voltage from a common pad.
As Claim 1 of this patent application explains, Samsung wants to protect:
“An electro-wetting display substrate comprising: a base substrate including a gate line extending in a first direction and a data line extending in a second direction, wherein the first direction is different from the second direction; a switching element electrically connected to the gate line and the data line; a pixel electrode electrically connected to the switching element; a notch electrode disposed adjacent to the switching element and overlapping the pixel electrode; and a water-repellent layer disposed over the pixel electrode.”
Apparatus and Method for Detecting Eyes
U.S. Patent No. 8457363
Facial recognition systems are typically used as a method of providing secure access to authorized persons based on biological features. These recognition systems can analyze eyes, mouths, noses and other facial structures from pictures in an image bank that have been assigned to a particular individual. Eyes in particular are important as they don’t change much in shape, even when a person’s facial expression changes. Infrared strobes are sometimes used to detect eye positioning, but shortcomings in this method include difficulties in detecting pupils behind eyeglasses and an inability to detect both eyes at once.
Samsung was awarded a USPTO patent recently to protect a more efficient system of detecting eye regions in facial recognition systems. An eye detection apparatus receives an image input and divides a facial image into left and right portions. A support vector machine (SVM) algorithm is then applied to each half of the image to detect whether an eye is contained within it.
As Claim 1 explains, the USPTO has given Samsung the right to protect:
“An apparatus for detecting eyes, comprising: an eye candidate detector which divides an input face image into left and right images and detects at least one eye candidate from each of limited image regions of the left and right images by binarizing each of the limited image regions based on threshold values corresponding to each of the limited image regions; an eye candidate evaluator which evaluates the eye candidates by evaluating each combination of the eye candidates using geometric information as to the eye candidates to filter out eye candidates that cannot be eyes; a learning database which stores a plurality of face images in which positions of eyes are arranged and a plurality of face images which do not include eyes or in which positions of eyes are not arranged; and an eye candidate verifier which verifies the eye candidates with reference to the learning database and outputs an eye detection result signal, wherein the eye candidate verifier includes: a support vector machine (SVM) classifier which receives from the eye candidate evaluator the face image including the eye candidates and calculates an output value using an SVM algorithm; a maximum output value selector which selects a maximum output value among the calculated output values; and an eye determiner which, when the maximum output value is at least equal to a threshold value, determines that the detection of the eyes has succeeded, and when the maximum output value is less than the threshold value, determines that the detection of the eyes has failed, and wherein the SVM classifier includes: a first support vector machine sub-classifier which calculates an output value for a general face; a second support vector machine sub-classifier which calculates an output value for a face with eyeglasses; and a third support vector machine sub-classifier which calculates an output value for a face with long hair, wherein the eye candidate detector binarizes the input face image based on a first threshold value, extracts outlines from the limited image regions of the binarized face image, determines whether the outlines are satisfactory based on shapes and sizes of the extracted outlines and, when it is determined that the outlines are not satisfactory outlines, increasing the first threshold by a first value and repeating the binarization of the input face image based on the first threshold value and the extraction of the outlines from the limited image regions of the binarized face image; and detects eye candidates within the satisfactory outlines, and wherein before the eye candidates within the satisfactory outlines are detected, the detection of the at least one eye candidate from the limited image regions of the left and right images includes, when a number of satisfactory outlines of the extracted outlines is less than a value M, increasing the first threshold value by a second value less than the first value and repeating the binarization of the input face image based on the first threshold value and the extraction of the outlines from the limited image regions of the binarized face image.”
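The split-and-binarize front end of the claim can be sketched briefly. This is a simplified illustration only: it assumes a grayscale image represented as a list of rows of 0-255 values, uses a crude pixel count in place of the claim’s outline-shape test, and omits the SVM verification stage entirely; all function names are invented.

```python
def split_halves(image):
    """Divide the face image into left and right halves, as the claim's
    eye candidate detector does."""
    mid = len(image[0]) // 2
    left = [row[:mid] for row in image]
    right = [row[mid:] for row in image]
    return left, right

def binarize(region, threshold):
    """Binarize a region: dark pixels (potential eye areas) become 1."""
    return [[1 if px < threshold else 0 for px in row] for row in region]

def detect_candidates(region, threshold=60, step=10, min_pixels=4, max_tries=10):
    """Mimic the claim's iterative step: if the binarized result is not
    satisfactory, increase the threshold value and repeat the binarization."""
    for _ in range(max_tries):
        binary = binarize(region, threshold)
        dark = sum(sum(row) for row in binary)
        if dark >= min_pixels:
            return binary
        threshold += step
    return binarize(region, threshold)
```

In the patented system, candidates surviving this stage would then be filtered geometrically and verified by the three SVM sub-classifiers (general face, eyeglasses, long hair) before detection is declared successful.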