A smarter ‘smart building’
A connected, smart building which is aware of its state and current business context.

A system comprising: a data acquisition controller to receive, from a plurality of collectors in a business environment, business activity data samples comprising at least one of audio data, image data, machine data, or sensor data; business activity type classifier circuitry to search against previously classified business activity data samples associated with baseline activity scores and identify a baseline business activity score; content recognition circuitry configured to recognize business activity context in the business activity data samples and match the business activity context information with reference content; and provider circuitry configured to communicate a message including at least one of the modified activity context information, the variance score, or the adjusted activity score.
The business event modeler 140 may determine business ‘energy’ levels for a group of events. The business energy level may be a metric that quantifies the significance of a group of business events and/or business activities to the corporation/organization.
Thus, the business energy level may include the business activity scores and/or the business event scores. For example, the event modeler 140 may aggregate multiple business event identifiers in order to compute an overall business energy score as a function of the estimated event scores of each particular business event identified. Each of the event identifiers may be associated with the event weight 306, which can be used in the computation. The business energy level may be calculated according to the following model:
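The model itself is not reproduced here. As a minimal sketch, assuming the business energy level is a weight-normalized aggregate of the per-event scores using the event weights 306 (the function name and the sample values are illustrative, not from the application):

```python
def business_energy(event_scores, event_weights):
    """Aggregate per-event scores into an overall business energy level.

    Assumes a weight-normalized sum; the application does not fix the
    exact model, so this is one plausible formulation.
    """
    total_weight = sum(event_weights)
    if total_weight == 0:
        return 0.0
    return sum(s * w for s, w in zip(event_scores, event_weights)) / total_weight

# Example: three events with estimated scores and event weights
level = business_energy([4.0, 8.0, 6.0], [1.0, 2.0, 1.0])  # weighted mean
```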

The state of such a connected building may be queried via APIs. For example, business activity information of an organization acquired through audio, video, and machine communication may be contextualized with scores and exposed via APIs. In addition, the system may select certain activity information based on the scores, then autonomously synthesize and broadcast messages according to a predefined template and strategies.
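As a hedged illustration of the selection-and-broadcast step, the sketch below filters activity records by score and renders each selected record with a message template; the template, field names, and threshold are assumptions, not part of the application:

```python
# Illustrative broadcast template; the application only says messages follow
# "a predefined template and strategies".
TEMPLATE = "{location}: {activity} (energy {score:.1f})"

def synthesize_messages(activities, min_score):
    """Keep activities whose score meets the threshold and render each
    one with the predefined template."""
    selected = [a for a in activities if a["score"] >= min_score]
    return [TEMPLATE.format(**a) for a in selected]

msgs = synthesize_messages(
    [{"location": "Room 12", "activity": "client demo", "score": 8.2},
     {"location": "Hallway", "activity": "foot traffic", "score": 2.1}],
    min_score=5.0,
)
```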

The machine data 110 may include information generated by machines in the organization. Machines may include workstation computers, local area networks, switches, routers, mobile phones, or any other machine used in an organization. In some examples, the machine data samples 110 may include information generated by software, for example an operating system, executing on the machines. Thus, by way of non-limiting example, the machine information may include CPU utilization, LAN utilization, energy consumed, user login/logout events, and any other type of information related to the machine. In other examples, machine data samples may also include information from, and/or related to, robot activity, virtual reality device usage, augmented reality device usage, laptops, smartphones, public screens, content presented on public screens, content accessed, internal corporate portal activity, and content request summaries, including, for example, keywords, contextual information, and document exchange. In other examples, the machine data may include an indication of an occurrence of a predetermined type of activity.
Machine data may include, for example, door usage, such as doors opening and closing; ventilation system usage, such as HVAC systems turning on and off; elevator usage; lights turning on and off; and room occupancy status (i.e., occupied, unoccupied, and the number of occupants). The sensor data 111 may include any other detected activity.
The plurality of different independent locations 112 may include physical areas within the organization. For example, the physical locations may include meeting spaces, hallways, recreational areas, offices, public areas, or any other type of physical space in the connected building. The organization may include the locations where individuals and/or machines conduct related activities. In some examples, the organization may include a building and/or a plurality of buildings. Alternatively or in addition, the organization may be distributed across multiple geographic areas.
The business activity types may include any type of activity that can occur in the organization in either physical spaces or virtual spaces. For example, the activity types may include a person walking into a room, a business meeting, a person logging into a workstation, CPU utilization on a workstation, network traffic, and/or any other classification of activity that occurs in physical space and/or virtual space. The activity type identifier 120 may uniquely identify the activity type.
For example, a data collector configured to sense light may determine that the lights in the room are turned on. In addition, a data collector configured to sense motion may determine that a person has entered the room. The activity database 117 may store an association between an activity type identifier indicative of “a person entering a room” with the activity data samples 102 from the data collector configured to sense light and the data collector configured to sense motion. When new activity data samples are received from the data collectors 114, the new activity data samples may be compared with previously associated activity data samples to determine the activity type identifier for the new activity data samples.
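The comparison step described above can be sketched as a nearest-match lookup against the previously classified samples in the activity database; the feature encoding (light level, motion level) and the identifiers below are illustrative assumptions:

```python
# Hedged sketch of the matching step: a stored association maps previously
# classified sample feature vectors to activity type identifiers, and a new
# sample is labeled with the identifier of its nearest stored sample.
def nearest_activity_type(new_sample, classified_samples):
    """classified_samples: list of (feature_vector, activity_type_id)."""
    def dist(a, b):
        # Squared Euclidean distance between feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(classified_samples, key=lambda entry: dist(entry[0], new_sample))[1]

# Features: (light level, motion level) -> activity type identifier
db = [((1.0, 1.0), "person-entering-room"),
      ((0.0, 0.0), "room-unoccupied")]
```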
The activity modeler 116 may determine energy metrics that quantify the significance of business activities identified through the activity data samples 102. For example, the baseline-activity score 122 may quantify the significance of the business activity type to the organization. The significance of the activity type to the organization may reflect, for example, the value of the activity type, the importance of the activity type, and/or any other interest the organization has in the activity type.
In addition to managing the activity object 118, the system may further manage a business event object 128. The event object 128 may include, or be associated with, any information acquired, created, and/or managed by the system related to an event. An event may include any group of related activities in an organization. Examples of the events may include a meeting, a presentation, a client demo, an ad-hoc unregistered arrangement of people, or any other planned or unplanned collection of related activities. The events may be predefined for each of the activity types and/or groups of activity types. For instance, a meeting may be expected to be accompanied by certain activity types (entering a room, exiting the room, talking, presenting, handshaking, etc.) and certain activity contexts associated with the event (i.e., the identities of certain attendees, the content of the conversations, etc.).
The predefined business event information may originate from the user of the system and may include expected information about the event; for example, an expected number of participants in a meeting, an expected time of the event, an expected location of the event, an energy level, and/or any other information about the event. The event context 136 may include particular keywords/products mentioned, participants of the meeting, topics discussed, etc.
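One way to contrast the predefined (expected) event information with what is actually observed is a simple variance score; the field names and the scoring rule below are assumptions, since the application does not fix a formula:

```python
# Hedged sketch: comparing expected event information with the observed
# event to produce a variance score (mean relative deviation).
def event_variance(expected, observed, fields=("participants", "duration_min")):
    """Mean relative deviation of observed numeric fields from expectations."""
    deviations = []
    for f in fields:
        exp, obs = expected[f], observed[f]
        # Relative deviation, falling back to 0/1 when the expectation is zero
        deviations.append(abs(obs - exp) / exp if exp else float(obs != exp))
    return sum(deviations) / len(deviations)

score = event_variance({"participants": 10, "duration_min": 60},
                       {"participants": 8, "duration_min": 75})
```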
In addition to organizing the activity data samples into the activity object 118 and the event object 128, the system may implement security measures to ensure that sensitive information of the organization is not communicated.
A variety of examples exist where the content controller may modify the organization information to generate the anonymized context information. The content controller 142 may, for example, modify a photo of a dry-erase board to remove sensitive writing on the board. The content controller 142 may modify and/or remove product information, such as a product name. The content controller 142 may remove and/or alter particular logos and/or symbols from images. The content controller 142 may remove and/or anonymize faces in a photo.
In some examples, the content controller 142 may remove, and/or anonymize faces which are identifiable in an image. Alternatively or in addition, the content controller may crop a photo and/or apply filters to make sure that there are no identifiable faces in the image. The content controller 142 may compare the faces with the information contained in the permissive content repository 146 and/or restricted content repository 144. Alternatively or in addition, the content controller 142 may evaluate the activity context data associated with the image to determine if the face in an image should be removed and/or anonymized. While the previous example applies to faces, the anonymization based on restricted content, permissive content, and activity context information may equally apply to particular voices, logos, symbols, discussions of organization projects, competitor names, product information, and/or any other organization information recognized in audio and/or video data.
The content controller 142 may apply keyword searches to determine if organization information is private or non-private. For example, the restricted content 144 may include words, phrases, products, and/or groups of characters that have been reserved as private. Similarly, the permissive content repository 146 may include words, phrases, and/or groups of characters that have been reserved as non-private. The content controller 142 may perform keyword searches, including full-text search, on the organization information to determine if the organization information includes any information contained in the restricted content repository 144 and/or the permissive content repository 146. In other examples, the restricted content 144 and/or the permissive content 146 may include data patterns, images, audio clips, and/or any other form of information that may be compared to determine if the organization information is private or public.
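The keyword check can be sketched as a substring search against the restricted and permissive term lists; the sample terms, and the rule that a permissive match overrides a restricted one, are assumptions:

```python
# Hedged sketch of the privacy check: text containing any restricted term is
# treated as private unless a permissive term explicitly whitelists it.
RESTRICTED = {"project apollo", "codename-x"}   # illustrative private terms
PERMISSIVE = {"public demo", "lobby event"}     # illustrative non-private terms

def is_private(text):
    """Return True if the text matches restricted content and no
    permissive content."""
    lowered = text.lower()
    if any(term in lowered for term in PERMISSIVE):
        return False
    return any(term in lowered for term in RESTRICTED)
```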
The activity information may include organization information. Organization information may include entities, objects, people, documents, texts, conversations, products, or any other information and/or article of information recognized from the activity data samples 104.
The term “energy metrics” means any quantitative value assigned to activity information to contextualize the significance of the activity information. Energy metrics may include baseline scores, adjusted scores, weight factors, or any other type of metric. For example, the energy metrics may include the baseline activity score 122, the adjusted activity score 126, the baseline event score 134, the adjusted event score 138, and/or the weight factor 121. In addition, the energy metrics may include any additional metrics, such as the energy score of multiple events. The word “score” is synonymous with metric. In some examples, the energy metrics may be defined on a scale of 0 through 10. In other examples, the energy metrics may include numeric scales, ordinal scales such as human-readable levels of business energy, and/or estimation-based fits of particular statistical distributions.
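Mapping a 0-10 numeric score onto the human-readable levels mentioned above could look like the following sketch; the level names and cut-offs are illustrative assumptions:

```python
# Hedged sketch: converting a numeric energy metric on the 0-10 scale into
# an ordinal, human-readable level. Labels and thresholds are examples only.
def energy_level_label(score):
    if not 0 <= score <= 10:
        raise ValueError("energy metrics are defined on a 0-10 scale")
    for cutoff, label in ((3, "low"), (7, "medium"), (10, "high")):
        if score <= cutoff:
            return label
```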
The content recognizer 204 may perform content recognition on the activity data samples 104 to determine the activity context information 124 included in the activity data samples 104. For example, the content recognizer 204 may be configured to recognize data patterns 218 in the activity data samples 104 and identify the activity context information 124 associated with the data patterns 218. In some examples, the content recognizer 204 may include audio recognition algorithms, image recognition algorithms, and/or video recognition algorithms. For example, the content recognizer 204 may include a speech-to-text processor and an image recognition processor. Examples of the speech-to-text processor may include hidden Markov models, neural networks, deep learning, and any other speech-to-text process known in the art. Examples of image recognition processors may include kernel PCA, latent semantic analysis, partial least squares, principal component analysis, multifactor dimensionality reduction, nonlinear dimensionality reduction, multilinear principal component analysis, or any other image recognition process known in the art.
The content recognizer 204 may also identify a topical indicator included in the activity data samples 104. The topical indicator may include a classification of content included in the activity data samples 104. In additional examples, the activity data samples 104 and/or the activity context information 124 used to form the topical indicators may include content and document access, meeting agendas and meeting minutes, words captured in meeting rooms, content presented, content consumed on public displays, explicitly set topics, and more.
CONTENT RECOGNITION AND COMMUNICATION SYSTEM / Patent application US-15/445340