Big Data and Cyber Crimes
In a world full of high technology, software is now available to predict future crimes, generate virtual "most wanted" lists, and collect personally identifiable information. This innovation is called Big Data. Big data is an evolving term that describes any voluminous amount of structured, semi-structured, and unstructured data that has the potential to be mined for information (Bigelow). Big data generates value from large data sets that cannot be analyzed with traditional computing techniques. The quantity of computer-generated data around the world is growing dramatically, and even given the technology, people, skills, and resources available, big data can be complex and difficult to work with. It is referred to in many contexts, such as science, medicine, meteorology, politics, advertising, and finance. It can even be used to predict and solve crime. As a society, we constantly produce data. For example, retailers build databases to record customer activity, and organizations working in logistics, financial services, healthcare, law enforcement, and many other sectors also capture data. Lastly, public social media creates vast quantities of digital material. This research paper will reflect on the rise of big data analytics and how effective its use can be in policing.
Big data is often characterized by five Vs: volume, variety, velocity, veracity, and value (Gewirtz). Volume refers to the massive amounts of data that are collected, generated, and stored through websites and other online applications. Big data is large, but there is no set definition of how big it must be. In this emerging field, the data being processed and analyzed is so large that storage alone can be a challenge. Volume poses both the greatest challenge and the greatest opportunity, as it helps many organizations understand people better and allocate resources more effectively. Variety represents the different types of information that can be defined as big data. Big data draws on many different sources, such as social media (tweets, video content, or pictures) and healthcare records, and the term also applies to the different structures the data can take. Velocity represents the speed at which big data can be collected, generated, and analyzed across different organizations. Time plays a big role in the velocity of big data: technological advances mean that many data sets are generated and can potentially be analyzed in real time. Much of this data is produced by the growing number of digital devices we carry, such as cell phones and tablets, which generate data from our activity. Organizations such as Google and Facebook use data science to analyze data in real time, making decisions and serving live content in response to the data being produced. Veracity refers to the accuracy and trustworthiness of the data that has been collected. Data generated from activities, time measurements, or organizational records can contain errors, duplicates, and abbreviations, and each source may have its own unique way of being coded and organized. The fifth V, value, adds to the justification of working with big data.
It is critical to know how value is extracted from big data. Big data technologies allow organizations to analyze vast amounts of data and gain valuable information from it; quality is always preferred over quantity. Data itself consists of the quantities, characters, or symbols on which a computer performs operations. The five Vs describe the characteristics of big data, but they also help us think about the different types of big data that exist.
There are three classifications of big data: structured, unstructured, and semi-structured (Davies & Bose). Structured data conforms to a fixed format or schema. It could include a person's health and education records, income and pension records, or any data collected so that a business or organization can provide a service or support research and analysis. For example, data stored in databases, CSV files, and Excel spreadsheets can be referred to as structured data. Structured data has the advantage of being accessible and easily entered, saved, processed, and analyzed. Unstructured data can be both human- and machine-generated. Machine-generated data includes satellite images (weather data and Google Earth imagery), scientific data from natural disasters such as earthquakes, and photographs and videos from CCTV or traffic monitors. Human-generated data includes textual data such as essays, reports, or doctors' notes; data from social media activities, notifications, and conversations; and website content such as pictures or video. Semi-structured data falls in between: it carries some organizing tags or markers but no rigid format. Data found in emails, log files, and Word documents can be referred to as semi-structured. Among all the advantages of big data there are some challenges, particularly in preparing data for analysis when there are many large data sets to work with. Understanding how the data is structured, how it is coded, how accurate it is, what is missing, and what the different terms and values mean is all very important. If the databases are too large, storage and processing power become limiting. When linking data sets, ethical issues of privacy, confidentiality, and anonymity affect the data being held. Over the years, big data has added enormous value to many organizations.
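To make the three classifications concrete, the following Python sketch handles one record of each kind. All data, field names, and values here are invented for illustration only.

```python
import csv
import io
import json

# Structured data: a fixed schema, e.g. rows of a CSV file or database table.
structured = io.StringIO("id,name,age\n1,Ana,34\n2,Ben,29\n")
rows = list(csv.DictReader(structured))

# Semi-structured data: tagged but flexible, e.g. a JSON log entry whose
# fields may vary from record to record.
log_entry = json.loads('{"event": "login", "user": "ana", "meta": {"ip": "10.0.0.1"}}')

# Unstructured data: free text with no predefined fields at all.
note = "Patient reports mild headache; follow up in two weeks."

print(rows[0]["name"])          # fields addressable through the schema
print(log_entry["meta"]["ip"])  # fields addressable by key; schema may vary
print(len(note.split()))        # only generic operations apply directly
```

The point of the sketch is the access pattern: structured rows can be queried by column, semi-structured records by key (if the key happens to be present), while unstructured text first needs parsing or mining before any analysis is possible.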
Historically, data was generated and accumulated by company employees entering it into computer systems. With the evolution of computers, the internet, and other technological advances, users and machines are now able to generate their own data. This progression has helped many businesses and organizations. The unstructured and structured data of web searches, emails, transactions, blog posts, social media feeds, and data streams from smart devices has become an information overload, and the ability to capture and analyze this wealth of information has defined many companies, organizations, and agencies. Big data insights can improve patient diagnostics in hospitals, enable retailers to deliver better customer service experiences, allow banks to detect fraudulent activity before it happens, and help law enforcement predict and solve crime. On a police force, officers will work amid computer screens blinking alive with crises, digital inner-city maps flagging 911 calls, television screens tracking breaking news stories, and surveillance cameras monitoring the streets, all linking analysts and patrol officers to a wealth of law enforcement intelligence. This is the high-tech command center for the future of policing around the world.
Big data is said to make an investigator's job easier and crime-fighting faster. Law enforcement agencies use two different methods with the help of big data: predictive analytics, which enables them to predict a percentage of crimes that will happen in a given time frame or specific location, and investigative analytics, which provides link analysis of people and places. Analysis that once took days can now take hours or even minutes. The software that law enforcement uses accesses the data, crunches the numbers, and generates predictions, making the results available to officers. In essence, the use of big data to predict crime has become very effective because today's devices each leave traces on their own networks; officers can use these networks to see the history of what has been going on and detect what is about to happen next. Among the numerous ways big data can be applied to policing, there are four key priorities: predictive crime mapping, predictive analytics, advanced analytics, and big data technology (Babuta). Predictive crime mapping is used by analysts in law enforcement agencies to map, visualize, and analyze crime patterns. Today's crime maps are generated by computers and fed by detailed crime data to predict where and when crime will happen. Police departments routinely scan geographical locations for hotspots, or places experiencing a significant amount of criminal activity. Beginning with data on previous crimes, computer programs create a projection for each location, estimating the risk of future crimes. With the help of these maps, many departments place officers at these specific locations to prevent future crime. Crime mapping is used to understand patterns of crime for the purposes of prevention, reduction, and identifying causes. This crime mapping and hotspot analysis has redefined predictive policing. The next key priority for the use of big data in policing is predictive analytics.
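Hotspot detection of the kind described above can be sketched very simply: count historical incidents per grid cell and flag cells whose counts cross a threshold. This is an illustrative toy, not any department's actual system; the coordinates, cell size, and threshold are all invented.

```python
from collections import Counter

# Hypothetical historical incidents as (x, y) coordinates; in practice
# these would come from geocoded crime reports.
incidents = [(1.2, 3.4), (1.3, 3.5), (1.1, 3.6), (7.8, 2.1), (1.4, 3.4), (5.0, 5.0)]

CELL = 1.0       # grid cell size, in the same units as the coordinates
THRESHOLD = 3    # minimum incident count for a cell to count as a hotspot

def to_cell(x, y, size=CELL):
    """Snap a point to the grid cell that contains it."""
    return (int(x // size), int(y // size))

# Count incidents per cell, then keep cells with unusually high activity.
counts = Counter(to_cell(x, y) for x, y in incidents)
hotspots = [cell for cell, n in counts.items() if n >= THRESHOLD]
print(hotspots)
```

Real systems weight recent incidents more heavily and smooth across neighboring cells, but the core step, aggregating past crime locations onto a map and ranking areas by risk, is the same.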
Predictive analytics is a process that consists of three parts: capture, predict, and act. Capture, the first step, refers to the collection of relational databases, Excel spreadsheets, previous transactions, and other big data sources. The second part is prediction: once data is collected, predictions are made using various techniques including data mining, text mining, and statistical analysis. Finally, after making predictions, one must act upon them; insight that is never acted upon is wasted. Predictive analytics is used to identify the risk associated with particular individuals, including their likelihood of reoffending, going missing, or becoming victims of certain crimes. The third key priority in predictive policing is advanced analytics, which enables the police to harness the full potential of data collected through visual surveillance such as CCTV images and automatic number plate recognition (ANPR) data (Babuta). Lastly, big data technology draws on open-source data, such as social media, to gain a better understanding of specific crime problems and the preventative techniques police can use to solve them.
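The capture-predict-act loop can be illustrated with a minimal sketch. The district names, counts, scoring rule, and threshold below are hypothetical stand-ins for real databases and real statistical models.

```python
def capture():
    # Capture: pull records from databases, spreadsheets, transaction
    # logs, and other sources (hard-coded here for illustration).
    return [
        {"district": "A", "incidents_last_30d": 42},
        {"district": "B", "incidents_last_30d": 7},
        {"district": "C", "incidents_last_30d": 55},
    ]

def predict(records):
    # Predict: score each record with some model; a trivial
    # normalization against the busiest district stands in for
    # real data mining or statistical analysis.
    peak = max(r["incidents_last_30d"] for r in records)
    return {r["district"]: r["incidents_last_30d"] / peak for r in records}

def act(scores, threshold=0.5):
    # Act: turn predictions into decisions, e.g. which districts
    # receive extra patrols this week.
    return sorted(d for d, s in scores.items() if s >= threshold)

priorities = act(predict(capture()))
print(priorities)
```

The structure mirrors the essay's point: the prediction step is worthless on its own; the value appears only when the `act` step converts scores into a deployment decision.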
Today, law enforcement gathers information from a large range of sources. Video cameras monitor buildings and streets, gunshot detectors alert police, automatic license plate readers find wanted vehicles, and biometric and facial recognition systems identify criminals. In addition, law enforcement agencies have access to vast data shared around the world. All of this data is valuable, but only if agencies can capture and evaluate it to help prevent and solve crime. With the help of the Crime Management Center and other software, law enforcement agencies can integrate and analyze data regardless of its source. These software systems are specifically designed to support the law enforcement process of preventing, detecting, responding to, and solving crime. The Crime Management Center enables police to prevent crime by collecting, integrating, and analyzing data from multiple sources to predict where and when crime is likely to happen. When agencies know where crime usually takes place, they can establish patrol schedules that deploy officers to the areas where they are most needed, reducing response time by getting officers where they need to be more quickly. The Crime Management Center also helps agencies detect crime through software such as Palantir, PredPol, CompStat, HunchLab, and Wynyard. These systems collect and aggregate information to provide automated alerts in command centers, and they allow agencies to respond to crime by putting incident information, such as crime history, known felons in the area of a call, enhanced 911 data, live video feeds, and addresses, into the hands of patrol units. With all this information at their fingertips, patrol units are able to react and respond to incidents more quickly and safely. When responding to such incidents, officers need to be able to access any information that might help them.
Various software systems provide exactly this kind of assistance.
Palantir is an intelligence-led policing platform and an important investigative crime-fighting tool for solving problems and integrating data. Palantir brings many databases into one system: crime and arrest report information, field interviews, automated license plate reader data, DMV records, and rap sheets. Instead of logging into multiple systems, users can search for a suspect, target, or location through a single portal and trace the results back to their original systems. For law enforcement agencies at the federal, state, and local levels, Palantir equips officers and agents with the tools they need to analyze intelligence securely, collaborate on investigations, manage cases, produce reports, and respond to crime as it happens (Palantir). PredPol is another software algorithm that uses big data to predict where crimes are most likely to happen. With the support of the National Science Foundation, a team of researchers in Los Angeles, California developed the algorithm after observing that crime data fit predictable mathematical patterns; the software was then built around those patterns. PredPol is based on seismic software: it looks at crime in one area, incorporates it into historical patterns, and predicts when and where it might occur next (O'Neil, p. 85). Blind to race and ethnicity, PredPol targets geographical locations, not individuals. Once the system is set up, officers can focus on crimes ranging from homicide, arson, and assault to vagrancy, aggressive panhandling, and the sale and consumption of illicit drugs. Many police departments around the country have had significant success in reducing crime using PredPol. A third system in use at the country's largest law enforcement agency is CompStat.
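The "seismic" idea behind this style of prediction, in which each past incident temporarily raises the risk of a follow-on incident in the same area, much as an earthquake raises the chance of aftershocks, can be sketched as an exponentially decaying score. This is a loose illustration of the self-exciting concept only, not PredPol's actual algorithm; the timestamps and decay rate are invented.

```python
import math

# Hypothetical timestamps (in days) of past incidents in one grid cell.
event_times = [0.0, 1.5, 2.0, 9.0]

def risk(now, times, decay=0.5):
    """Risk score for a cell: each past event contributes an amount that
    decays exponentially with its age, so recent incidents dominate."""
    return sum(math.exp(-decay * (now - t)) for t in times if t <= now)

# Shortly after a cluster of incidents the score is high...
print(round(risk(3.0, event_times), 3))   # ≈ 1.302
# ...and it falls off as the cluster recedes into the past.
print(round(risk(10.0, event_times), 3))  # ≈ 0.646
```

Ranking all grid cells by such a score each shift, and sending patrols to the highest-scoring cells, captures the gist of how historical patterns are turned into a daily prediction.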
CompStat was pioneered in the 1990s by New York police commissioner William J. Bratton. It is a system for tracking and mapping crime in the city, combined with smart management, targeted enforcement, and direct accountability for the commanders of the city's 77 precincts (NBC News). Where suspect names and hotspot locations were once shared only behind closed doors, CompStat gives officers a road map for combating crime. Commissioner Bratton and his team later reinvigorated CompStat by adding new tactics and applying intensive analysis techniques to individual cases and crime patterns. With CompStat, the NYPD was able to reduce traffic stops, arrests, and crime altogether; the crime statistics reflect these results, people feel safer and more comfortable around the city, and tourism has increased dramatically. HunchLab is another algorithm police use to combat criminal activity. Produced by Philadelphia-based startup Azavea, HunchLab represents the newest iteration of predictive policing, a method of analyzing crime data and identifying patterns that may repeat in the future (Chammah). It aligns patrol activities with the priorities of local neighborhoods and communities, allocates resources to prevent over-policing, and helps determine which tactics work and which do not. Lastly, Wynyard is a global crime-fighting software company providing advanced analytics and investigative case management products. Wynyard allows agencies to attack the world's most serious crime problems. Intelligence analysts and investigators across government security, law enforcement, financial services, and critical infrastructure use Wynyard to identify persons of interest, detect fraud and money laundering, prevent high-consequence cybercrime, investigate national crime, and counter new-generation extremism. Rapidly surfacing and exploring hidden entities, connections, and events is critical to preventing and solving serious crime.
As technological advances continue, more cities and law enforcement agencies will be able to upgrade their systems and adopt predictive policing.
The information presented about big data analytics, algorithms, and predicting where and when crimes are likely to occur raises fundamental questions about how it affects policing. Predictive policing is certainly a law enforcement tool of the future, but is there a risk of relying too heavily on an algorithm? There are both positive and negative arguments about big data's use in policing. Questions of democracy, privacy, and individual freedom arise whenever predictive policing is discussed. Many people consider these methods controversial because of security issues, yet the underlying crime data and arrest records already exist; they are simply not easily accessible in police databases. Moreover, every technological system has its faults, so something can backfire and seriously hurt someone or damage a case or procedure. Existing law enforcement database technology should be kept up to date and networked properly, and deploying sufficient patrol resources to act on predictions effectively requires careful planning to ensure crime prevention and other benefits. When solving crimes, every second counts. This technology can significantly decrease the time it takes to catch a suspect, make an arrest, and save a life; predictive policing can genuinely make the difference in life-and-death situations. All of these law enforcement tools are convenient for the people using them, and these advances in technology let officers solve crimes and make arrests more easily while protecting citizens. The list of technology that aids law enforcement today is endless. From flying drones to predictive analytics software and handheld fingerprint scanners, technology has changed the profession of law enforcement into an evolving opportunity for those interested in becoming officers of the law.
Today, data can be generated anytime, anywhere. The data generated through smart watches, online activity, geographic location movements, or even credit card swipes is used for one thing: to profile and understand the individual consumer producing it. Unlike traditional police methods, the use of big data to make predictions in policing is changing our relationship with the law. At first, using big data to solve crime sounds like something out of a Tom Cruise or RoboCop movie: software attempting to forecast the highest-risk times and places for future crimes. Yet many cities have seen a significant decrease in crime statistics because of these algorithms. Officers already have most of the information they need at their disposal, but this technology enables them to solve crime much more quickly and to react more proactively with accessible, on-hand information. There are still many questions to consider about the overall efficacy of these programs, but the approach also makes sense: an officer's job is not only about arrests or chasing the bad guys. With the help of algorithms, crime data is analyzed, patterns are spotted, and deciding where to send patrols becomes much easier. It also helps cut down on city crime by stopping it before it happens. Big data's usage should be left to officers' discretion to use whatever tools are needed to keep the world a safer place.