the need for data mining in security systems

benefits of video security systems in the mining industry | mistral solutions

Video surveillance systems can be used in a variety of ways within the mining industry. The development of specialised, integrated video security systems for specific environments has vastly increased the benefits they deliver to users. However, the mining industry has considerable scope for exploiting the more sophisticated functions provided by integrated systems with data mining techniques, which have added significant value in other business environments. In this article Jan de Beer, MultiVid; Johan Raubenheimer, Fourier Systems; and Carmen Lahr, Geutebruck GmbH, outline how this integration can provide mining operations with a very powerful tool.

Besides perimeter security, precious metal / stone mines normally have hundreds of CCTV cameras installed throughout the plant. These surveillance cameras observe the many processes within the plant and are mostly installed in a permanent, overt manner, with portable covert cameras being used on a temporary basis. The control room operators observe the general movement of plant and people, but invariably with limited success. Without the ability to effectively highlight unusual or high-risk events, video surveillance operators have too many security cameras to monitor and cannot effectively detect the tell-tale events that could betray suspicious activities.

However, the mature intelligent video management systems that are now available can provide strategic assistance and integrate with virtually any source of event data. Complex rules can be constructed to take into account related but disparate events which indicate impending failure or situations that warrant further investigation. These enable the video operator's attention to be targeted much more effectively on potentially productive areas.

Nowadays any well-developed business administration system uses many tools to achieve business goals. Some industries have highly sophisticated systems which integrate video analysis and other data, allowing automated in-depth analysis to a degree impossible for humans within a workable timeframe. To reveal the potential for adopting and adapting similar systems for use in a mining environment we first need to look at systems already employed. One such example is the toll plaza, where a lot of money changes hands in what can be a very remote location.

When this kind of technology is used for monitoring transactions at toll plazas, it can handle metadata including the identity of the operator, event inputs from axle-counting and other road-side devices, vehicle number plate recognition, transaction detail, video motion detection, date and time, etc. For toll applications a rule-based data processing engine generates events for detected anomalies. The system extracts off-line images for each of these events, while a powerful, easy-to-use query tool allows the user to drill down to the relevant exceptions. This enables management to view each anomaly rapidly and, after auditing the event, either close it off for higher-management review or print it via a customised, automated report writer; in the latter case an operator disciplinary investigation is possible. And being linked directly to the DVR or NVR system means the recordings can be reviewed immediately. Without this kind of rapid event viewing system, verification is a laborious or virtually impossible task, and users seldom verify the events that take place.
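To make the idea concrete, here is a minimal Python sketch of how such a rule-based engine might flag anomalies. The transaction fields, rules and thresholds are illustrative assumptions, not any vendor's actual implementation.

```python
# Minimal sketch of a rule-based anomaly engine for toll transactions.
# All field names and rules are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Transaction:
    operator_id: str
    timestamp: datetime
    axles_counted: int   # reported by the road-side axle counter
    axles_charged: int   # vehicle class the operator actually charged
    plate: str           # from number plate recognition
    amount: float

def axle_mismatch(tx: Transaction) -> bool:
    """The operator charged a different vehicle class than the axle counter saw."""
    return tx.axles_counted != tx.axles_charged

def zero_amount_at_night(tx: Transaction) -> bool:
    """A zero-value transaction processed during low-supervision hours."""
    return tx.amount == 0.0 and not (6 <= tx.timestamp.hour < 22)

RULES = [axle_mismatch, zero_amount_at_night]

def detect_anomalies(transactions):
    """Yield (rule_name, transaction) for every rule that fires, so each
    exception can be linked back to the recorded video for review."""
    for tx in transactions:
        for rule in RULES:
            if rule(tx):
                yield rule.__name__, tx
```

Each generated event would then be tied to the corresponding off-line images and recordings for the query tool to drill into.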

In the mining environment there are many process control elements such as weight measurements, conveyor belt control mechanisms (normal control and emergency stop), ore transportation systems, ore concentration or extraction processes and finally the product measuring and transport systems. Access control card readers are often installed throughout the plant to restrict the movement of people, and extended alarm systems monitor both the plant process operation and also the unauthorised opening of distribution boxes, Programmable Logic Controllers (PLCs) and many other control systems.

In the ideal intelligent electronic security system all the events captured in the sub systems would be critically analysed in a proper trend analysis. This would not only extract relevant process control data, but more specifically, the security risks. For example, the emergency stoppage of a conveyor belt might be an orchestrated event to create an environment in which the mined product can be removed by hand during the resultant spillage recovery operation.

Currently in such situations consideration is rarely given to which operators and security staff are on duty in the plant at the time. A well-defined trend analysis could quite conceivably extract the fact that it is always the same staff on duty when this transpires. To analyse this trend on a manual basis is nigh impossible. Obviously the risk associated with this particular exception varies depending on the mining environment, as the concentration of the mined product may be so low that the risk is minimal. However, it may play a role within the context of the bigger picture.
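As a rough illustration of how such a trend could be surfaced automatically, the following Python sketch counts how often each staff member was on duty when an emergency stop occurred. The shift logic and data shapes are assumptions made for the example.

```python
# Hypothetical sketch: correlate conveyor emergency stops with the duty
# roster to surface staff disproportionately present when stoppages occur.

from collections import Counter
from datetime import datetime

def shift_of(ts: datetime) -> str:
    """Map a timestamp to a simple two-shift key (illustrative assumption)."""
    return f"{ts.date()}-day" if 6 <= ts.hour < 18 else f"{ts.date()}-night"

def stoppage_counts_by_staff(stoppages, roster):
    """stoppages: list of emergency-stop timestamps.
    roster: dict mapping a shift key to the set of staff IDs on duty.
    Returns how often each staff member was on duty during a stoppage."""
    counts = Counter()
    for event_time in stoppages:
        for staff_id in roster.get(shift_of(event_time), ()):
            counts[staff_id] += 1
    return counts
```

Staff who appear far more often than their share of shifts would predict are exactly the kind of exception a manual review would never catch.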

Although it may be normal to investigate the unauthorised opening of control boxes, the authorised opening or dismantling of the same boxes should also be investigated for trends and hence suspicious patterns. Besides this, other plant failures, such as motors and pumps, are seldom considered. Video motion detection events should be verified against all other events that take place, and both overactive and under-active alarms should be considered.

The first weight measurements mentioned above are seldom compared against the measurement of the final product, because the extended time from the start to the finish of processing usually makes manual checking impractical. However, many sophisticated systems can be installed along this process, and valuable events may be extracted from them to capture and highlight trends in an exception-reporting format. It is quite conceivable that a trend analysis would pinpoint a problem long before the end of the ore-processing cycle has been reached.
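A simple reconciliation along these lines could look like the following Python sketch; the batch structure, expected yield and tolerance are purely illustrative assumptions.

```python
# Sketch of an exception report comparing ore weighed in with product
# weighed out per batch; expected_yield and tolerance are made-up values.

def weight_exceptions(batches, expected_yield=0.02, tolerance=0.25):
    """batches: iterable of (batch_id, ore_in_tonnes, product_out_tonnes).
    Flags batches whose recovery rate deviates from the expected yield
    by more than the given fraction of that yield."""
    for batch_id, ore_in, product_out in batches:
        recovery = product_out / ore_in
        if abs(recovery - expected_yield) > tolerance * expected_yield:
            yield batch_id, recovery

# Example: the second batch deviates strongly from the expected yield,
# which could indicate a measurement fault -- or diverted material.
for anomaly in weight_exceptions([("B1", 1000, 19.5), ("B2", 1000, 31.0)]):
    print(anomaly)   # ('B2', 0.031)
```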

Clearly any system that helps security staff concentrate on more fruitful activities increases efficiency and improves security. But the availability of a customised video security system integrated with other control and management systems can provide many mining operations with a very powerful tool, one which can deliver significant benefits in other areas too, such as safety and processing efficiency. Once managers realise the power at their fingertips they see new opportunities for applying it, and considerable added value can be derived from the investment.

To produce an effective system a considerable amount of customisation is involved, so we are not talking here about off-the-shelf products. As a potential purchaser, you need to identify a suitable supplier who has relevant products and experience, and you also need to ensure that the chosen system has already been developed and installed successfully. This is to avoid vapourware or pie-in-the-sky systems which still need development.

You will need to confirm that all your currently installed systems can be interfaced to enable the necessary event extraction by external software controlled systems, and that software development kits (SDKs) and application programming interfaces (APIs) are available to facilitate this. If they are not, then your software integrator must be able to extract the data by other means. He too needs to be selected for the appropriateness of his skills and track record.
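Where an SDK or API is available, event extraction might look something like the Python sketch below. The endpoint, parameters and token scheme are hypothetical; every vendor's interface will differ.

```python
# Illustrative only: polling a hypothetical vendor REST API for plant events.
# The /events endpoint, its fields and the bearer token are assumptions.

import requests

def fetch_events(base_url: str, token: str, since):
    """Pull events newer than `since` from an assumed /events endpoint."""
    resp = requests.get(
        f"{base_url}/events",
        params={"since": since.isoformat()},
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # assumed to return a list of event dictionaries
```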

Alongside supplier representatives, the project team should include someone to represent you, the end-user. This person's role is to provide the input needed to define and create the rules base which will then form the basis of the analysis through a filter process, so he needs to have an intimate knowledge of plant processes.

Any potential purchaser needs to realise that the system will need to be fine-tuned during use, as everyone on the team will find ways to improve its performance; because of this, ongoing software maintenance will form part of the complete solution.

mining security case study - cathexis

Still considered a critical contributor to the local economy, the South African mining sector boasts a total annual income of R550 billion and is one of the country's biggest employers. With South Africa's economy built on gold and diamond mining, the sector is an important foreign exchange earner, with gold accounting for more than one-third of exports. Gold mining in South Africa also accounts for over 10% of the world's gold production.

Headquartered in Johannesburg, South Africa, AngloGold Ashanti is the second largest global mining company with 19 gold mining operations based in nine countries, as well as several exploration programmes in both the established and new gold producing regions of the world.

Developing a security management system for the mining sector is not a simple task. Mining operations are typically characterised by multiple sites in several remote locations, often based in hostile environments with extremely limited infrastructure. In addition to preventing incidents that involve security breaches and theft, the security or risk management plan must take a number of other factors into consideration, including occupational health and safety and the threat of labour unrest by hundreds of workers on site.

A typical mining security management system would combine various security features, including: Manned Guarding, CCTV Surveillance, Access Control, Perimeter Security, Fire Protection, Alarms, X-Ray Systems and Plant Management Systems, to mention a few. AngloGold Ashanti, with its operations based throughout various countries in Africa, is no exception.

AngloGold Ashanti's African CCTV solution includes some 2 500 cameras distributed across about 12 geographical locations, including mines and plants in South Africa, Ghana, Tanzania and Guinea. Cathexis Technologies has provided AngloGold Ashanti with its flagship video management surveillance system, CathexisVision, for the past 15 years, making all of these sites remotely accessible for support and centralised monitoring using the CathexisVision software.

While security is certainly a high priority for the mining sector, risk management has to work closely with other departments to streamline operational costs effectively and increase return on investment. Therefore, AngloGold Ashanti was recommended a solution which, through its integration capabilities with third-party systems and powerful analytics functionality, could generate actionable information not only for security management but for all mining operations.

CathexisVision provides a sophisticated management tool that enables operators and managers to function in a multi-tiered control environment with multiple CCTV cameras installed across various sites and locations. The system manages these multiple sites from a central command centre.

One of the major requirements of the CathexisVision installation at the AngloGold Ashanti mining sites has been the need to upgrade the CathexisVision software annually, ensuring that the installation keeps pace with the latest international trends and developments and that the surveillance system reaps the full benefits of the very latest CathexisVision features. As with all such sites, certain physical challenges can complicate the security management system, but thanks to the intuitive characteristics of CathexisVision, the VMS works hand in hand with surveillance cameras and third-party systems to execute a series of pre-determined actions based on the alarm triggers received, accelerating operator efficiency and allowing for instant control-room reaction. Exceptional integration capability, real-time monitoring, and the ability to operate reliably in any climate make Cathexis a leading global provider of mining security solutions.

Given that Cathexis has worked with AngloGold Ashanti for just over 15 years, it has been necessary to migrate the original analog CCTV system to a digital, networked solution, in line with global market trends.

However, given the significant investment in existing surveillance systems, Cathexis is able to apply its technology and integrate with existing analog systems, bringing them up to speed with the latest IP technology. In addition, CathexisVision supports most popular brands of IP cameras, using either ONVIF or proprietary protocols.

most common data mining security issues you need to be aware of

Data volumes are increasing every day, and that data includes the most sensitive information about different businesses. Data is dear to every business, and thus businesses try to protect it with the most advanced set of technologies.

With the massive amount of data being generated today, it is obvious that it attracts substantial costs for storage as well as maintenance. Thus it is increasingly being transferred to cloud platforms.

Not only this, but the regular updates and mining that have to be performed on this data can also fall prey to various security threats, because a lot of data moves to and fro between the cloud and various resources (which might be corrupted or unreliable).

Access controls exist to verify the identity of the person trying to access the data. A single layer of access control might seem an easy way to protect the data, but it is certainly not a secure option.
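A toy Python sketch of the difference: one layer checks only a password, while a layered scheme also requires a second factor. All helpers and values are illustrative; real systems should use salted, slow password hashes and a proper OTP implementation.

```python
# Illustrative only: single-layer vs. layered (two-factor) access control.

import hashlib
import hmac

def verify_password(stored_hash: str, attempt: str) -> bool:
    # NOTE: plain SHA-256 is for illustration; use bcrypt/scrypt in practice.
    attempt_hash = hashlib.sha256(attempt.encode()).hexdigest()
    return hmac.compare_digest(stored_hash, attempt_hash)

def grant_access(stored_hash: str, password: str,
                 otp_expected: str, otp_entered: str) -> bool:
    # Both layers must pass: a stolen password alone no longer grants access.
    return (verify_password(stored_hash, password)
            and hmac.compare_digest(otp_expected, otp_entered))
```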

Auditors, while auditing the security architecture of the organization, also measure it against these accepted standards when analyzing the possibility of breaches, thus giving a proper picture of the health of the existing security architecture.

Many new techniques, such as auto-tiering, are nowadays used for storing bulky data, but they come with their own drawbacks: the data storage solution provided by auto-tiering doesn't keep any track of where the data is being stored.

To handle large data computations, a technique like MapReduce is used, whose main task is to split the data into chunks; the drawback of this technique is that users have no control over the process of distributed computation.
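A minimal in-process illustration of the MapReduce idea in Python: the framework splits the data into chunks and the user supplies only the map and reduce logic, which is precisely why the distributed steps are outside the user's control.

```python
# Toy, single-machine MapReduce: split -> map -> group -> reduce.

from collections import defaultdict
from functools import reduce

def map_reduce(records, chunk_size, mapper, reducer):
    # "Split": partition the input into fixed-size chunks, as a framework would.
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    # "Map": emit (key, value) pairs for each record in each chunk.
    grouped = defaultdict(list)
    for chunk in chunks:
        for record in chunk:
            for key, value in mapper(record):
                grouped[key].append(value)
    # "Reduce": fold each key's values into a single result.
    return {k: reduce(reducer, vs) for k, vs in grouped.items()}

# Example: word counting.
counts = map_reduce(
    ["a b", "b c", "a"], 2,
    mapper=lambda line: [(w, 1) for w in line.split()],
    reducer=lambda x, y: x + y,
)
print(counts)  # {'a': 2, 'b': 2, 'c': 1}
```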


10 challenges to big data security and privacy - dataconomy

Big Data cannot be described just in terms of its size. However, to establish a basic understanding: Big Data refers to datasets which can't be processed in conventional database ways due to their size. This kind of data accumulation helps improve customer care service in many ways. However, such huge amounts of data can also bring forth many privacy issues, making Big Data security a prime concern for any organization. Working in the field of data security and privacy, many organizations are acknowledging these threats and taking measures to prevent them.


The reason for such breaches may also be that security applications designed to store certain amounts of data cannot handle the big volumes that the aforementioned datasets involve. Also, these security technologies are inefficient at managing dynamic data and can control only static data. Therefore, a merely periodic security check cannot detect issues in continuously streaming data; for this purpose, you need full-time privacy protection while data is streaming and being analyzed.

Data stored in a storage medium, such as transaction logs and other sensitive information, may be held at varying tiers, but that alone is not enough. For instance, the transfer of data between tiers gives the IT manager insight into which data is being moved. With data sizes continuously increasing, the scalability and availability requirements make auto-tiering necessary for big data storage management. Yet the auto-tiering method poses new challenges for big data storage, because it doesn't keep track of the data storage location.

End-point devices are the main factors in maintaining big data. Storage, processing and other necessary tasks are performed on input data provided by the end-points. Therefore, an organization should make sure to use only authentic and legitimate end-point devices.

Computations and other digital assets in a distributed framework, like the MapReduce function of Hadoop, mostly lack security protections. The two main preventive measures are securing the mappers and protecting the data in the presence of an unauthorized mapper.

Due to the large amounts of data generated, most organizations are unable to maintain regular checks. However, it is most beneficial to perform security checks and observation in real time, or nearly in real time.

A secured data storage device is an intelligent step towards protecting the data. Yet, because data storage devices are so often vulnerable, it is necessary to encrypt the data and secure the access control methods as well.
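As a minimal sketch of encryption at rest, the widely used Python cryptography package provides symmetric encryption via Fernet. Key management, the genuinely hard part, is omitted here.

```python
# Sketch of encrypting sensitive records before they reach storage,
# using the `cryptography` package (pip install cryptography).

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, load from a key management service
cipher = Fernet(key)

token = cipher.encrypt(b"account=12345; balance=9000")  # store this ciphertext
plaintext = cipher.decrypt(token)                       # decrypt on authorized read
assert plaintext == b"account=12345; balance=9000"
```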

Analyzing different kinds of logs could be advantageous and this information could be helpful in recognizing any kind of cyber attack or malicious activity. Therefore, regular auditing can be beneficial.

Data stores such as NoSQL databases have many security vulnerabilities which cause privacy threats. A prominent security flaw is that they are unable to encrypt data while it is being tagged or logged, distributed into different groups, streamed or collected.

Organizations must ensure that all big data stores are immune to security threats and vulnerabilities. During data collection, all the necessary security protections, such as real-time management, should be in place. Keeping in mind the huge size of big data, organizations should remember that managing such data can be difficult and requires extraordinary effort. However, taking all these steps helps maintain consumer privacy.

Peter Buttler is an infosecurity expert and journalist. He holds a Master's degree in cybersecurity and technology and has 7 years of experience in online security and privacy. He interviews security authorities to present expert opinions on current security matters. In his writing, he emphasizes serious security threats that have a worldwide impact. You can follow him on Twitter @peter_buttlr.


why you need a data warehouse | james serra's blog

You likely have heard about data warehousing, but are unsure exactly what it is and if your company needs one. I will attempt to help you to fully understand what a data warehouse can do and the reasons to use one so that you will be convinced of the benefits and will proceed to build one.

In my experience, not nearly as many companies have a data warehouse as I would have expected. And many of those that say they have a data warehouse don't really have a true data warehouse, but rather a dumping ground for tables that are copied from source systems with little modification.

For a company to be successful in the future, it must make good decisions. And making good decisions requires all relevant data to be taken into consideration. And the best source for that data is a well-designed data warehouse.

The concept of data warehousing is pretty simple: Data is extracted on a periodic basis from source systems, which are applications such as ERP systems that contain important company info. Data from these systems is moved to a dedicated server that contains a data warehouse. When it is moved it is cleaned, formatted, validated, reorganized, summarized, and supplemented with data from many other sources. This resulting data warehouse will become the main source of information for report generation and analysis via reporting tools that can be used for such things as ad-hoc queries, canned reports, and dashboards.
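A toy version of that extract-transform-load flow, sketched in Python with pandas; the file names and columns are invented for the example, and a production warehouse would use a dedicated ETL tool such as SSIS.

```python
# Toy ETL pass: extract a source-system export, clean and summarise it,
# then load the result into the "warehouse" (here just a file).

import pandas as pd

# Extract: pull the raw export from the source system (hypothetical file).
orders = pd.read_csv("orders_export.csv", parse_dates=["order_date"])

# Transform: clean, validate and summarise before loading.
orders = orders.dropna(subset=["customer_id", "amount"])  # drop incomplete rows
orders = orders[orders["amount"] > 0]                     # reject bad values
daily = (orders
         .groupby(orders["order_date"].dt.date)["amount"]
         .sum()
         .rename("daily_revenue")
         .reset_index())

# Load: write into the warehouse (normally a bulk load into a DBMS).
daily.to_csv("warehouse_daily_revenue.csv", index=False)
```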

Building data warehouses has become easier over the years due to improvements in the tools, improvements in the processes (e.g. see Ralph Kimball's books) and a better understanding of the architectures (see Building an Effective Data Warehouse Architecture). And of course there are consultants who can help!

A goal of every business is to make better business decisions than their competitors. That is where business intelligence (BI) comes in. BI turns the massive amount of data from operational systems into a format that is easy to understand, current, and correct so decisions can be made on the data. You can then analyze current and long-term trends, be instantly alerted to opportunities and problems, and receive continuous feedback on the effectiveness of your decisions. See Why you need Business Intelligence.

The concept of a data warehouse is not difficult to understand. Basically the idea is to create a permanent storage space for the data needed to support reporting, analysis, and other BI functions. While it may seem wasteful to store data in multiple places (source systems and the data warehouse), the many advantages of doing that more than justify the effort and expense.

Data warehouses reside on servers dedicated to this function, running a database management system (DBMS) such as SQL Server and using Extract, Transform, and Load (ETL) software such as SQL Server Integration Services (SSIS) to pull data from the source systems into the data warehouse.

The data needed to provide reports, dashboards, analytic applications and ad-hoc queries all exists within the production applications inside your company, so why not use the BI tools directly against this data? Well, there are many reasons why you would want to use a data warehouse instead of the direct access approach:

Even though additional hardware and software are needed, the presence of a data warehouse costs less and delivers more value than a direct connection. With the continued drop in costs for processing power and storage, that makes the case for a data warehouse even stronger.

James is a Data Platform Architecture Lead at EY, and previously was a big data and data warehousing solution architect at Microsoft for seven years. Before that he was an independent consultant working as a Data Warehouse/Business Intelligence architect and developer. He is a prior SQL Server MVP with over 35 years of IT experience.

Thanks for the post Malik. I do think the trend is changing. I have been surprised by how many companies do not have a true data warehouse. But it seems most realize the benefits of a data warehouse and BI and are taking steps to build a true DW along with BI solutions. I get a lot of requests from clients to come in and help them understand how to build a DW and the proper architecture. Most want to know best practices and how everyone else is building a DW. It's a smarter move to bring in a consultant who has built many DWs and get guidance and certainty that you are building it correctly than to just do it all on your own, especially if you have not built one before. Kinda like trying to build a house on your own without asking a contractor for help!

Though I have never been a fan of a 3NF DWH, there are good uses for 3NF, including hierarchy enforcement in a staging area. What really chaps my @$$ is the misuse of the term data mart. It is the old business dept vs business process subject area argument.

The sad fact, however, is that it is human nature to cluster and operate in groups, and that tendency is demonstrated to the fullest in most unsuccessful DWH projects. I believe human nature itself is the biggest reason for DWH failure, and that is why we do not see more of them. However, when properly incentivized and shown a truthful proof-of-concept, data warehouse projects can succeed in the same manner in which a good bill becomes a productive law in the US: education, executive support, compromise, and no-bulls__t leadership.

data mining process: models, process steps & challenges involved

Data Mining, which is also known as Knowledge Discovery in Databases (KDD), is a process of discovering useful information from large volumes of data stored in databases and data warehouses. This analysis supports decision-making processes in companies.

Data Mining is a process of discovering interesting patterns and knowledge from large amounts of data. The data sources can include databases, data warehouses, the web, and other information repositories or data that are streamed into the system dynamically.

With the advent of Big Data, data mining has become more prevalent. Big data is extremely large sets of data that can be analyzed by computers to reveal certain patterns, associations, and trends that can be understood by humans. Big data has extensive information about varied types and varied content.

Thus, with this amount of data, simple statistics with manual intervention will not work. This need is fulfilled by the data mining process, marking the shift from simple data statistics to complex data mining algorithms.

The data mining process will extract relevant information from raw data such as transactions, photos, videos, flat files and automatically process the information to generate reports useful for businesses to take action.

For any business problem, the raw data is examined to build a model that describes the information and produces the reports the business will use. Building a model from data sources and data formats is an iterative process, as the raw data is available in many different sources and many forms.

CRISP-DM is a reliable data mining model consisting of six phases: Business Understanding, Data Understanding, Data Preparation, Modeling, Evaluation, and Deployment. It is a cyclical process that provides a structured approach to data mining. The phases are not strictly sequential; the process sometimes requires backtracking to previous steps and repeating actions.

#2) Data Understanding: This step collects all the data and populates it in the tool (if one is used). The data is listed with its source, location, how it was acquired, and any issues encountered. Data is visualized and queried to check its completeness.

#4) Modeling: This step covers selecting a data mining technique such as a decision tree, generating a test design for evaluating the selected model, building models from the dataset, and assessing the built model with experts to discuss the results (see the sketch after this list).

#5) Evaluation: This step will determine the degree to which the resulting model meets the business requirements. Evaluation can be done by testing the model on real applications. The model is reviewed for any mistakes or steps that should be repeated.

#6) Deployment: In this step a deployment plan is made, a strategy to monitor and maintain the data mining model results is formed to check their usefulness, final reports are made, and the whole process is reviewed to check for mistakes and see whether any step should be repeated.
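As a small illustration of the modeling phase (#4 above), here is a decision-tree model built and assessed with scikit-learn, using its bundled Iris dataset as stand-in data.

```python
# Sketch of CRISP-DM's modeling phase: build a decision tree and
# evaluate it against a held-out test set (the "test design").

from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

model = DecisionTreeClassifier(max_depth=3, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```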

SEMMA makes it easy to apply exploratory statistical and visualization techniques, select and transform the most significant predictive variables, create a model using those variables to produce a result, and check its accuracy. SEMMA, too, is driven by a highly iterative cycle.

The data mining process is divided into two parts i.e. Data Preprocessing and Data Mining. Data Preprocessing involves data cleaning, data integration, data reduction, and data transformation. The data mining part performs data mining, pattern evaluation and knowledge representation of data.

There are many factors that determine the usefulness of data, such as accuracy, completeness, consistency and timeliness. Data is of high quality if it satisfies its intended purpose. Thus preprocessing is crucial in the data mining process. The major steps involved in data preprocessing are explained below.

Binning smooths data in several ways. In smoothing by bin means, each value is replaced by the mean of its bin. In smoothing by bin medians, each value is replaced by the median of its bin. In smoothing by bin boundaries, the minimum and maximum values of the bin serve as the bin boundaries, and each value is replaced by the closest boundary value.
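A short worked example of the three smoothing variants on a single bin; the numbers are arbitrary.

```python
# Smoothing one bin of sorted values in the three ways described above.

bin_values = [4, 8, 9, 15]

# Smoothing by bin means: every value becomes the bin mean.
mean = sum(bin_values) / len(bin_values)            # 9.0
by_mean = [mean] * len(bin_values)                  # [9.0, 9.0, 9.0, 9.0]

# Smoothing by bin medians: every value becomes the bin median.
median = sorted(bin_values)[len(bin_values) // 2]   # 9 (upper median, even n)
by_median = [median] * len(bin_values)              # [9, 9, 9, 9]

# Smoothing by bin boundaries: each value snaps to the nearest boundary.
lo, hi = min(bin_values), max(bin_values)
by_boundary = [lo if v - lo <= hi - v else hi for v in bin_values]  # [4, 4, 4, 15]
```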

When multiple heterogeneous data sources such as databases, data cubes or files are combined for analysis, this process is called data integration. This can help in improving the accuracy and speed of the data mining process.

Different databases have different naming conventions for variables, thereby causing redundancies in the integrated data. Additional data cleaning can be performed to remove these redundancies and inconsistencies from the data integration without affecting the reliability of the data.

This technique is applied to obtain a reduced representation of the data for analysis: much smaller in volume while maintaining integrity. Data reduction is performed using methods such as dimensionality reduction, numerosity reduction and attribute subset selection (for which techniques like decision tree induction can be used).
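One common dimensionality reduction technique is principal component analysis (PCA); a minimal scikit-learn sketch, again using the bundled Iris dataset:

```python
# Data reduction via PCA: project 4 attributes onto 2 principal
# components while retaining most of the variance.

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=2).fit(X)
X_reduced = pca.transform(X)           # shape (150, 4) -> (150, 2)
print(pca.explained_variance_ratio_)   # variance kept by each component
```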

In this process, data is transformed into a form suitable for the data mining process. Data is consolidated so that the mining process is more efficient and the patterns are easier to understand. Data transformation involves data mapping and code generation processes.

Data Mining is a process of identifying interesting patterns and knowledge from a large amount of data. In these steps, intelligent methods are applied to extract data patterns. The data is represented in the form of patterns, and models are structured using classification and clustering techniques.

This step involves identifying interesting patterns representing the knowledge based on interestingness measures. Data summarization and visualization methods are used to make the data understandable by the user.

Relational Database management systems such as Oracle support Data mining using CRISP-DM. The facilities of the Oracle database are useful in data preparation and understanding. Oracle supports data mining through java interface, PL/SQL interface, automated data mining, SQL functions, and graphical user interfaces.

#1) Financial Data Analysis: Data Mining is widely used in banking, investment, credit services, mortgage, automobile loans, and insurance & stock investment services. The data collected from these sources is complete, reliable and is of high quality. This facilitates systematic data analysis and data mining.

#2) Retail and Telecommunication Industries: The retail sector collects huge amounts of data on sales, customer shopping history, goods transportation, consumption, and service. Retail data mining helps to identify customer buying behaviors, shopping patterns and trends, improve the quality of customer service, and achieve better customer retention and satisfaction.

#3) Science and Engineering: Data mining in computer science and engineering can help to monitor system status, improve system performance, isolate software bugs, detect software plagiarism, and recognize system malfunctions.

#4) Intrusion Detection and Prevention: Intrusion is defined as any set of actions that threaten the integrity, confidentiality or availability of network resources. Data mining methods can help in intrusion detection and prevention system to enhance its performance.

Data Mining is an iterative process where the mining process can be refined, and new data can be integrated to get more efficient results. Data Mining meets the requirement of effective, scalable and flexible data analysis.
