An artificial-intelligence technique based on a five-layer feedforward backpropagation algorithm is applied in this study for technical screening of enhanced oil recovery (EOR) methods. Explicit knowledge patterns associated with the field data are extracted by taking advantage of the robustness of fuzzy-logic reasoning and the learning capability of neural networks. Field data from successful EOR projects include parameters such as depth, porosity, permeability, viscosity, oil API gravity and oil saturation. These parameters were used as input and predicted output in the training and validation processes, respectively. The developed model was then tested using a data set from Block T of the Angolan oilfield. A sensitivity analysis was performed between the Mamdani and the Takagi-Sugeno-Kang (TSK) model approaches incorporated in the algorithm. The results of the sensitivity analysis show the robustness of the ANFIS approach in comparison to other approaches for the prediction of a suitable EOR technique. Five regression models (linear, potential, logarithmic, power and polynomial) were applied to evaluate the accuracy of the model between the trained and the tested data sets. The simulation results show that hydrocarbon gas, polymer, combustion and CO2 are the suitable EOR techniques and could be used for further experimental and numerical studies.
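The screening idea above can be sketched with a toy feedforward network trained by backpropagation. The six inputs mirror the parameters listed (depth, porosity, permeability, viscosity, API gravity, oil saturation), but the architecture, the training rows and the binary "gas injection suitable" label are invented for illustration; this is not the paper's ANFIS model.

```python
# Minimal feedforward-backpropagation sketch for EOR-method screening.
# All data below are made up; inputs are assumed scaled to [0, 1].
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """One hidden layer, trained by plain stochastic gradient descent."""
    def __init__(self, n_in, n_hid):
        self.w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hid)]
        self.b2 = 0.0

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        return sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)

    def train(self, data, epochs=2000, lr=0.5):
        for _ in range(epochs):
            for x, t in data:
                y = self.forward(x)
                # Output-layer delta for squared error with a sigmoid unit.
                d_out = (y - t) * y * (1 - y)
                for j, h in enumerate(self.h):
                    d_hid = d_out * self.w2[j] * h * (1 - h)
                    self.w2[j] -= lr * d_out * h
                    for i, xi in enumerate(x):
                        self.w1[j][i] -= lr * d_hid * xi
                    self.b1[j] -= lr * d_hid
                self.b2 -= lr * d_out

# Inputs: depth, porosity, permeability, viscosity, API, oil saturation.
# Label 1 = "gas injection suitable" (light, low-viscosity oil) -- invented.
data = [([0.8, 0.2, 0.3, 0.1, 0.9, 0.6], 1),
        ([0.7, 0.3, 0.4, 0.2, 0.8, 0.5], 1),
        ([0.2, 0.6, 0.7, 0.9, 0.2, 0.7], 0),
        ([0.3, 0.5, 0.6, 0.8, 0.3, 0.8], 0)]
net = TinyNet(n_in=6, n_hid=4)
net.train(data)
print(round(net.forward([0.75, 0.25, 0.35, 0.15, 0.85, 0.55]), 2))
```

A real screening model would of course be trained on hundreds of documented EOR projects and would output one score per EOR method rather than a single binary label.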
Reservoir engineers need in-depth knowledge of reservoir characteristics to conduct reservoir simulations, laboratory tests and pilot tests in order to implement a designed scenario in the real reservoir under control. Different screening criteria and screening tables have been suggested in published papers so that a reservoir engineer can use them to reduce the number of case studies. There are major issues in using these tables, such as overlap between the suggested criteria. This makes it difficult for an engineer to assess the most suitable EOR process and, in addition, there is no economic evaluation for implementation of the methods. Therefore, an appropriate tool is required for primary screening of the suitable recovery method, as well as for economic evaluation of different candidate scenarios, prior to any operations being performed. In this research, two different screening tools are presented through a graphical user interface. Both screening tools are designed around artificial neural network architectures. The first screening tool is based on the updated criteria proposed by Al Adasani and Bai, with networks developed for classification of EOR methods via miscible and immiscible injection of carbon dioxide, nitrogen and hydrocarbon gas, while the other uses a neuro-simulation technique for prediction of reservoir performance and economic evaluation of injecting CO2, N2 and natural gas into the reservoir. Neuro-simulation is a hybrid synergistic technique that couples soft-computing and hard-computing approaches. On the hard-computing side, reservoir models were built and run using CMG simulation software; on the soft-computing side, different neural networks were developed in MATLAB® software.
Naturally, oil is produced from a reservoir by its potential energy, the pressure gradient between surface and subsurface conditions. When this energy declines, attention turns to enhanced oil recovery (EOR) techniques for recovering more oil from existing and abandoned oil fields. EOR methods may be divided into three categories: thermal, chemical and gas injection. Thermal methods supply heat to the reservoir and include steam or hot-water injection and the in-situ combustion technique. Chemical flooding involves injection of certain chemicals that either change the characteristics of the reservoir fluids or improve the oil recovery mechanisms; these include polymer, surfactant, alkaline and miscible flooding (either first- or multi-contact miscible).
The use of artificial intelligence is investigated as the basis to mitigate the problems of accounting databases. The following are some difficulties with existing accounting database systems: the needs of decision makers are not met by accounting information; humans do not understand or cannot process the computerized accounting databases; the systems are not easy to use; and the focus is on numeric data. Integrating intelligent systems with accounting databases can assist in the investigation of large volumes of data, with or without direct participation of the decision maker. Thus, such systems can analyze the data and help users understand or interpret transactions to determine what accounting events are captured by the system. With artificial intelligence, knowledge is stored and retrieved in natural language, and several AI tools and techniques support a broader understanding of the events captured by the accounting system. There is more emphasis on symbolic or text data, rather than just numeric data, to capture context. Artificial intelligence and expert systems build intelligence into the database to assist users: without users' direct participation, such models help by sorting through large quantities of data, assist decision makers under time constraints, and suggest alternatives in the searching and evaluation of data.
AI shows us that it is possible to emulate certain aspects of intelligence simply by feeding data into a computer and using relatively simple algorithms to find patterns that can then be used to make predictions, a process called machine learning. The most sophisticated AI algorithms use a machine-learning approach called "deep learning", with multiple layers of artificial neural networks, mathematical models loosely based on how real neurons are connected. New applications are emerging every day, from detecting disease to crafting pick-up lines. Yet AI is still no match for the human intellect when it comes to "human" things such as creativity and common sense. Even deep learning remains far from achieving the holy grail of artificial "general intelligence", defined as learning and reasoning abilities across multiple domains (from medicine to politics, say) and with broad autonomy. AI remains limited to specific tasks and relies on a lot of carefully curated data, as well as computer programming, to optimize how its algorithms execute the task at hand. So-called neuromorphic computer architectures are designed to help curb AI's insatiable appetite for computing power, with chips that, like the brain, integrate memory and processing and are less power-hungry than conventional chips.
injected; oil saturation cannot be reduced below the residual oil saturation (ROS). In polymer flooding, a small quantity of polymer is added to water, giving the water a higher viscosity while all its other characteristics remain intact. As such, polymer flooding, or modified water flooding, cannot reduce ROS in the oil reservoir. However, depending upon the type of polymer used, there may be certain changes in the characteristics of the porous medium after it comes into contact with the polymer solution. In the case of polyacrylamide, there is an appreciable decrease in the permeability of the porous medium to water due to adsorption of polymer in the pore channels; this is an almost irreversible phenomenon. Such behavior is not found with other types of polymers, such as biopolymers.
The model above collects information regarding transactions, which illustrates the application of Big Data in this case. Analysis of the big data using AI and ML algorithms provides insights that are applicable in strategic (long-term) and operational (short-term) forecasting. Strategic planning assists in determining the budget that a company could allocate to a given venture depending on the impact it could have, particularly in facilitating the fulfilment of goals and achievement of objectives. It also helps in creating a time frame for the financial cycle. The case of Uber also shows how forecasting helps in enhancing operations and insights. Operations basically involve following the strategic plan put in place, which implies that the set target has to be achieved. A relatively short time frame is used in operations planning, with repeated cycles of spending against budgets. On insights, business performance is evaluated, with re-evaluations and adjustments to the budget target as necessary, in order to make recommendations for improving financial forecasting. There are various key components of AI solutions that need to be implemented efficiently and effectively: AI is implemented as an independent domain, it is a data science, and AI is digital. AI algorithms therefore identify and map the material factors that have a bearing on revenue. AI is needed to enhance the core market drivers, which include sales projections, marketing and promotions, price variability, incentive programs, and competitor activities.
Artificial Intelligence (AI) is the cognitive ability of a computer or machine to think and learn. It is imperative to evaluate the extent to which AI can integrate Emotional Intelligence (EI), which is being posited as a facilitator of enhancing millennial engagement. This study is undertaken with two objectives: 1) exploring the extent of use of AI in financial services and 2) examining AI's role in enhancing a student's experience in learning activities. The methodology of the study is mainly exploratory. The study brings forth certain significant implications for the future of skill development and the ethical issues that require deliberation for purposes of legislation and the drafting of public protection policies.
in order to ensure that they behave as passive tracers during EOR operations. This is not obvious, given the large concentrations of EOR chemicals used in most EOR operations. Labelling of individual components by radioactive isotopes gives unique possibilities to follow specific organic chemicals used in EOR processes. Although possibly too expensive to be applied in full-field tests, these methods are certainly well suited for laboratory tests and perhaps even small-scale pilot studies. We also illustrated the advantages of solving tracer transport separately from the fluid flow. A separate solution of the tracer problem makes it possible to state and solve the tracer problem from pre-solved and stored flow solutions, saving significant CPU time and allowing for accurate solutions through separate tracer-grid refinement.
During early production, oil can pass through the small capillaries and tiny pores created naturally beneath the earth's crust. In the initial stages of extraction, the reservoir pressure is so high that no external pressure is needed; this stage is called primary recovery. It depends on various characteristics of the site as well as the properties of the hydrocarbon fluid being extracted. In some sites located in ocean systems, the pressure generally depends on an aquifer drive that pushes the oil and makes it move. During the initial stages the pressure is high, which helps the oil come out naturally, but as production continues the pressure declines, and after a certain point external pressure must be applied with pumps to keep the oil flowing and the process economically feasible. Another mechanism that can sustain pressure is dissolved gas. When the reservoir pressure falls below the bubble point, the reservoir starts releasing gases in the form of small bubbles. These bubbles are initially trapped in small pores, but gradually they grow, forming a dissolved-gas drive. As the pressure falls further, the expanding gas cap makes it more difficult to extract oil, since the number of gas-filled capillaries also increases and diverts the path of oil flow.  B. Secondary Production of Oil
Presently, the slimtube technique is the most accepted approach in the industry. A slimtube is a small-diameter tube (<0.25") with length up to 75 ft, packed with sand or glass beads, that represents a one-dimensional reservoir (Amao et al., 2012). An oven or water bath is normally used to control the temperature of the slimtube. The slimtube is saturated with crude oil and then gas-flooded; the miscibility conditions are determined by applying different injection pressures. Each injection pressure corresponds to the recovery factor obtained after 1.2 pore volumes (PV) of injected gas. Finally, oil recovery vs. pressure is plotted and interpreted to determine the MMP, which is identified as the breakthrough point in the recovery vs. pressure plot (Hamdi and Awang, 2014). Accurate estimation of the minimum miscibility pressure is important in conducting numerous simulation runs. The MMP is the minimum miscibility pressure, which defines whether the displacement mechanism in the reservoir is miscible or immiscible (Farzad and Amani, 2012). Miscible displacement is a process in which the injected and displaced phases mix in all proportions such that they do not form interfaces or two phases. The single-phase condition implies that all resident oil is displaced by the solvent from the pore space it invades. Some fluids, like propane, fulfill this definition; the majority of solvents available for oilfield use form two distinct phases with reservoir oils over a wide range of mixtures and pressures (Mathiasson, 2003). Displacement fluids such as hydrocarbon solvents, CO2, flue gas or nitrogen could
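The graphical interpretation described above, recovery vs. pressure with a breakthrough point, can be sketched numerically: fit one straight line to the steep sub-MMP branch and one to the plateau, and take their intersection as the MMP estimate. The data points and the split index below are synthetic, for illustration only; real slimtube interpretation may use other breakthrough criteria.

```python
# Sketch of MMP estimation from a slimtube recovery-vs-pressure curve
# by intersecting two least-squares lines. Data are synthetic.

def fit_line(pts):
    """Least-squares slope and intercept for a list of (x, y) points."""
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return m, (sy - m * sx) / n

def estimate_mmp(data, split):
    """Intersect lines fitted to data[:split] and data[split:]."""
    m1, b1 = fit_line(data[:split])
    m2, b2 = fit_line(data[split:])
    return (b2 - b1) / (m1 - m2)

# (injection pressure [psia], recovery at 1.2 PV injected [%]) -- synthetic
curve = [(1500, 55), (1800, 68), (2100, 81), (2400, 94),
         (2700, 95), (3000, 95.5), (3300, 96)]
print(round(estimate_mmp(curve, split=4), 1))  # → 2412.0
```

The steep branch (recovery rising with pressure) and the plateau (near-complete recovery once miscibility is reached) meet at the estimated MMP, here about 2412 psia for the invented data.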
An Intrusion Detection System (IDS) monitors the events occurring in a computer system or network and analyzes them for signs of intrusion. It is useful not only in detecting successful intrusions, but also in monitoring attempts to break security, which provides important information for timely counter-measures. Basically, IDSs can be classified into two types: misuse intrusion detection and anomaly intrusion detection. Traditional protection techniques such as user authentication, data encryption, avoidance of programming errors, and firewalls are used as the first lines of defense for computer security, but they have failed to fully protect networks and systems from increasingly sophisticated attacks and malware. As a result, intrusion detection systems have become an indispensable component of security infrastructure, used to detect these threats before they inflict widespread damage. Recently, Artificial Intelligence (AI) techniques have been employed in different data mining and machine-learning classification and prediction modeling schemes. In addition, hybrid data mining schemes, hierarchical hybrid intelligent system models, and ensemble learning approaches that combine base models with other hybrid machine-learning paradigms, to maximize accuracy while minimizing both root-mean-squared error and computational complexity, have also gained popularity in the literature.
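As a minimal illustration of the anomaly-detection branch mentioned above, one can profile "normal" traffic by per-feature mean and standard deviation and flag connections whose deviation exceeds a threshold. The features, values and threshold here are invented; production IDSs use far richer models.

```python
# Toy anomaly-based intrusion detection: learn a statistical profile of
# normal connections and flag large z-score deviations. Data are invented.
import statistics

def fit_profile(normal):
    """Per-feature mean and standard deviation of normal traffic."""
    cols = list(zip(*normal))
    return [(statistics.mean(c), statistics.pstdev(c)) for c in cols]

def is_anomalous(conn, profile, threshold=3.0):
    """Flag if any feature deviates more than `threshold` sigmas."""
    return any(abs(x - mu) / sd > threshold
               for x, (mu, sd) in zip(conn, profile) if sd > 0)

# Features per connection: [bytes sent, duration in s, failed logins]
normal_traffic = [[500, 2.0, 0], [520, 2.2, 0], [480, 1.9, 1],
                  [510, 2.1, 0], [495, 2.0, 0]]
profile = fit_profile(normal_traffic)
print(is_anomalous([505, 2.0, 0], profile))   # typical connection → False
print(is_anomalous([9000, 0.1, 8], profile))  # flood-like connection → True
```

A misuse detector would instead match connections against known attack signatures; the two approaches are complementary, which is why the hybrid schemes cited above combine them.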
CO2 has been used within the petroleum industry for a number of years as an agent for secondary and tertiary recovery of hydrocarbons. Injection of CO2 into oil-bearing reservoirs has resulted in more efficient sweeping of the remaining oil and has boosted the ultimate recovery from older fields. However, determining the path taken by the injected CO2 has been problematic. Consequently, it is believed that pockets of unswept reservoir often exist and significant quantities of oil have been left behind.
In the development of modern society, many large auto companies have begun to study driverless cars. Cars with self-identifying road conditions and automatic driving functions, as seen in the 007 movie series, will become a reality. In the Internet age, self-driving car technology involves not only artificial intelligence but also the introduction of new integrated technologies such as automatic control and visual computing to improve the architecture of existing vehicles, with automatic identification, automatic analysis and automatic control. Therefore, in the future development of society, autonomous vehicles will achieve three technological breakthroughs: first, using camera equipment, radar and laser rangefinders to obtain more road-condition information; second, using maps to complete automatic vehicle navigation; third, effectively controlling the speed and direction of travel of the car based on the available information. In the future development of self-driving cars, information exchange and mutual sensing between vehicles can effectively coordinate vehicle speed and formation, avoid collisions between vehicles, and help guarantee the safety of the self-driving car.
For the first step of the analysis we employ Adaptive Recursive Partitioning (ARP), which has so far been applied mainly to realistic and synthetic 3D region datasets of discrete (binary) voxel values. Some initial results from attempts to apply the technique to real fMRI datasets have been presented in . The main idea of this technique is to treat the initial 3D volume as a hyper-rectangle and search for informative regions by partitioning the space into sub-regions. The intelligence of the tool lies in the selectivity of partitioning the hyper-rectangles in an adaptive way: only hyper-rectangles that do not exhibit statistically significant discriminative power are selected to be partitioned recursively. More specifically, for each sample, we use the mean of all voxel values belonging to the volume (hyper-rectangle) under consideration as a measurement of its activation/deactivation level. The adaptive partitioning of the 3D space continues in the following way: a hyper-rectangle is partitioned only if the corresponding attribute does not have sufficient discriminative power to determine the class of samples. To decide this, we can apply statistical parametric tests (e.g. the t-test) or non-parametric tests (e.g. the Wilcoxon rank sum test). The procedure progresses recursively until all remaining sub-regions are discriminative or a sub-region becomes so small that it cannot be further partitioned. For this purpose, we define the maximum number of partitioning steps (depth) that the partitioning can go through. If the splitting criterion is satisfied, the spatial sub-domain (or hyper-rectangle) corresponding to the node of the oct-tree is partitioned into 8 smaller sub-domains. The corresponding tree node becomes the parent of eight children nodes, each corresponding to a sub-domain, and the new measurements corresponding to the region data in the sub-domains become new candidate attributes.
Observe that the proposed method effectively reduces the multiple-comparison problem encountered when using voxel-based analysis: the number of times a statistical test is applied is significantly reduced, since we selectively deal with groups of voxels (hyper-rectangles) rather than individual voxels.
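Under some simplifying assumptions, the partitioning loop described above can be sketched as follows. A region's attribute is the mean voxel value per sample; a region is kept if it discriminates the two classes and is otherwise split into its 8 octants up to a maximum depth. A crude effect-size gap stands in for the t-test / rank-sum test of the method, and the tiny 4x4x4 volumes are synthetic.

```python
# Sketch of adaptive recursive partitioning over a 3D volume (oct-tree).
# The discriminative-power test and all data are simplified/invented.
import itertools

def region_mean(vol, box):
    (x0, x1), (y0, y1), (z0, z1) = box
    vals = [vol[x][y][z] for x in range(x0, x1)
            for y in range(y0, y1) for z in range(z0, z1)]
    return sum(vals) / len(vals)

def discriminative(means_a, means_b, min_gap=1.0):
    """Crude effect-size criterion standing in for a statistical test."""
    ma = sum(means_a) / len(means_a)
    mb = sum(means_b) / len(means_b)
    return abs(ma - mb) >= min_gap

def octants(box):
    """Split a 3D box at the midpoint of each axis into 8 children."""
    halves = []
    for lo, hi in box:
        mid = (lo + hi) // 2
        halves.append([(lo, mid), (mid, hi)])
    return [tuple(b) for b in itertools.product(*halves)]

def arp(class_a, class_b, box, depth, found):
    ma = [region_mean(v, box) for v in class_a]
    mb = [region_mean(v, box) for v in class_b]
    if discriminative(ma, mb):
        found.append(box)              # keep as an informative attribute
    elif depth > 0 and all(hi - lo > 1 for lo, hi in box):
        for child in octants(box):     # recurse only on weak regions
            arp(class_a, class_b, child, depth - 1, found)

def make_vol(active):
    """4x4x4 volume; class A is 'active' in one corner octant."""
    return [[[3.0 if active and x < 2 and y < 2 and z < 2 else 0.0
              for z in range(4)] for y in range(4)] for x in range(4)]

class_a = [make_vol(True) for _ in range(3)]
class_b = [make_vol(False) for _ in range(3)]
found = []
arp(class_a, class_b, ((0, 4), (0, 4), (0, 4)), depth=2, found=found)
print(found)  # → [((0, 2), (0, 2), (0, 2))]
```

Note how the whole-volume mean is diluted (0.375 vs 0 here) and fails the test, so the volume is split; only the corner octant survives as a discriminative attribute, which is exactly the selectivity the method relies on to reduce multiple comparisons.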
The advanced optimal control of fusion plasma confinement is addressed by the methods of artificial intelligence and drift kinetic equations, meaning that fuzzy systems and artificial neural networks are applied in the optimization process. In this way, the use of neutral beam injection, the ion-cyclotron range of frequencies and electron cyclotron current drive can be analysed as actuators together with the Vlasov-Poisson-Fokker-Planck equations. We construct an asymptotic solution of the equations of magnetohydrodynamics (MHD) at high Reynolds numbers. One of the quantities that may be considered in this context is the waiting time of an experimental or observational parameter. As numerical tools we can use fuzzy, adaptive and local Monte Carlo methods or the particle-in-cell method via advanced variational calculus.
Artificial Intelligence (AI) is one of the technologies that has been, and continues to be, outstandingly useful and efficient in making automatic computerized systems ever more customized to human needs. AI is the main technology implemented with the aim of making machines intelligent and able to mimic humans in actions as well as in doing various things. Typically, computer systems have far greater capabilities for gathering and analyzing data than humans do. AI is often combined with other technologies such as machine learning, whose algorithms learn patterns and are able to repeat the actions those patterns represent. In forecasting, for instance, AI and machine learning analyze patterns in the market and establish a systematic trend in them, which can be applicable even in the future. Forecasting has been viewed as a science, since it is characterized by scientific goals and involves establishing sets of data that are analyzable, with outcomes being either optimistic or pessimistic.
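A minimal sketch of the trend idea above: fit a linear trend to past figures and extrapolate the next period. The revenue series is invented for illustration; real AI forecasting pipelines are, of course, far richer.

```python
# Toy trend-based forecast: least-squares linear trend plus extrapolation.
# The quarterly revenue figures are invented.

def linear_trend(ys):
    """Least-squares slope and intercept of ys against 0..n-1."""
    n = len(ys)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def forecast(ys, ahead=1):
    """Extrapolate the fitted trend `ahead` periods past the data."""
    slope, intercept = linear_trend(ys)
    return intercept + slope * (len(ys) - 1 + ahead)

quarterly_revenue = [100, 108, 118, 125, 133]  # in $k, invented
print(round(forecast(quarterly_revenue), 1))   # → 141.7
```

A "systematic trend" in the sense of the text is exactly the fitted slope; the optimistic or pessimistic outcome corresponds to whether that slope is positive or negative.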
Artificial intelligence (AI) is a group of techniques with considerable potential for application to pavement engineering and management. In this study, we developed a practical, flexible and out-of-the-box approach that applies genetic algorithms to optimizing budget allocation and road-maintenance strategy selection for a road network. The aim is to provide an alternative to existing software and to better fit the requirements of a significant number of pavement managers. To meet the objectives, a new indicator, named the Road Global Value Index (RGVI), was created to capture the pavement condition, the traffic, and the economic and political importance of each and every road section. This paper describes the approach and its components through an example confirming that genetic algorithms are very effective for the intended purpose.
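A hedged sketch of the genetic-algorithm idea: encode one maintenance strategy per road section as a gene, score a plan by its total gain (standing in for RGVI improvement) subject to a budget, and evolve a population by selection, one-point crossover and mutation. All costs, gains and GA settings below are invented and are not the paper's model.

```python
# Toy genetic algorithm for budget-constrained maintenance planning.
# Sections, costs, gains and the budget are all invented.
import random

random.seed(1)

# Per section: (cost, gain) options; index 0 = "do nothing".
SECTIONS = [[(0, 0), (40, 5), (90, 12)],
            [(0, 0), (30, 4), (70, 9)],
            [(0, 0), (50, 7), (120, 15)],
            [(0, 0), (20, 2), (60, 8)]]
BUDGET = 160

def fitness(plan):
    cost = sum(SECTIONS[i][g][0] for i, g in enumerate(plan))
    gain = sum(SECTIONS[i][g][1] for i, g in enumerate(plan))
    return gain if cost <= BUDGET else -1  # penalize infeasible plans

def evolve(pop_size=30, generations=60):
    pop = [[random.randrange(3) for _ in SECTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(SECTIONS))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                  # mutation
                child[random.randrange(len(child))] = random.randrange(3)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The penalty in `fitness` keeps over-budget plans out of the surviving population; in a full RGVI-based tool the gain term would weight each section by condition, traffic and importance rather than a single invented number.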