
fully connected recurrent neural networks

Stable Adaptive Neural Control of a Robot Arm

... Indirect Neural Network Controller (IDNC) composed of two separate fully connected recurrent neural networks: the Neural Controller (NC) and the Adaptive instantaneous ...

Hopfield Neural Networks for Aircrafts’ Enroute Sectoring: KRISHAN-HOPES

... artificial neural networks are biologically ... artificial neural networks perform computational tasks by modeling the human brain ... the neural networks are divided into two ...

Image Captioning with Recurrent Neural Networks

... Feed-forward neural networks together with the backpropagation algorithm have proved very useful for a range of tasks, and it has even been proven [9, 27] that they can approximate any continuous ... is connected to large ...

Unified Framework For Deep Learning Based Text Classification

... artificial neural networks, which are inspired by the biological brain model made of ... convolutional neural networks (CNN), deep belief networks, recurrent neural networks ...

GROUP OF RECURRENT NEURAL NETWORKS

... (iii) FORP (Flow Oriented Routing Protocol): FORP [1][3][5] is an on-demand routing protocol based on a pure flooding mechanism. Moreover, it maintains a prediction-based multi-hop handoff mechanism. This attempt is ...

GROUP OF RECURRENT NEURAL NETWORKS

... We can define open source software as “software which source code is available to its users for modification, use, and redistribution” [1]. Open source software’s code is fully available for different purposes, i.e. ...

GROUP OF RECURRENT NEURAL NETWORKS

... using a Neural Network has been illustrated by ... two neural network models for comparing the ... propagation neural model and the Feed-forward back-propagation network model have been found to be better ...

GROUP OF RECURRENT NEURAL NETWORKS

... power loss, voltage regulation and fault currents for uniformly distributed loads only. Although the impacts of a single DG on different aspects of operation are reviewed, uniformly distributed loads are not common in ...

GROUP OF RECURRENT NEURAL NETWORKS

... is connected to a coupling transformer that is connected directly to the AC bus whose voltage is to be ... is connected is maintained at the desired reference ...

GROUP OF RECURRENT NEURAL NETWORKS

... Jenq-Neng et al. (1996) have provided an analysis showing that the maximum correlation training criterion used in cascade-correlation learning tends to produce hidden units that saturate, and thus makes it more suitable ...

Predicting Hurricane Trajectories Using a Recurrent Neural Network

... As was stated in the Related Work section, the sparse Recurrent Neural Network with a flexible topology was trained and tested using the Unisys Weather dataset. This method, although it presented impressive ...

Dynamic Multicast Transmission Packets Control in DTNs Using Mobile Adhoc Networks

... Tolerant Networks (DTNs), also called occasionally connected mobile networks, are wireless networks in which a fully connected path from source to destination is unlikely to ...

GROUP OF RECURRENT NEURAL NETWORKS

... Mr. Neeraj Sahu received the M.S. and M.Phil. degrees in Mathematics from Bundelkhand University, Jhansi, in 1999 and 2008 respectively. He is a Ph.D. student at Jiwaji University, Gwalior. His research interests are Neural ...

Short term load forecasting based on hybrid artificial neural networks and particle swarm optimisation

... tested neural networks, one can see that the performance of the ERNN models was not satisfactory in this ... these networks, although they drew a similar shape, were consistently lower than the actual ...

The prediction of an image region from a set of pixels

... deep neural networks for intra prediction cannot currently be trained in reasonable time via the proposed iterative training with ... the neural network architectures for faster inference [47], [48] ...

Convolutional Neural Network Language Models

... Convolutional Neural Networks (CNNs) have been shown to yield very strong results in several Computer Vision ... for recurrent models, our model outperforms RNNs but is below state-of-the-art LSTM ...

Sentiment Classification Via Recurrent Convolutional Neural Networks

... the Recurrent Neural Network ... Convolutional Neural Network (CNN) for sentiment ... or recurrent neural networks, CNN may be more beneficial to the process of capturing text ...

Stability Criteria Of Fully Connected Hopfield Artificial Neural Network

... Artificial Neural Network (HANN). The Hopfield Neural Network is a multiple-loop feedback neural network which can be used as an associative ... are connected to every other neuron but without any ...
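The snippet above describes the defining structure of a Hopfield network: a fully connected feedback network, usable as an associative memory, in which every neuron connects to every other neuron but not to itself. A minimal sketch of that idea, assuming the standard Hebbian storage rule and bipolar (+1/−1) states (details not given in the snippet):

```python
import numpy as np

# Sketch of a Hopfield network as an associative memory.
# Assumptions: Hebbian weight storage, bipolar states, synchronous sign updates.

def train_hopfield(patterns):
    """Build the symmetric weight matrix from bipolar (+1/-1) patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # every neuron connects to every other, but not itself
    return W / n

def recall(W, state, steps=5):
    """Iterate the feedback loop until the state settles on a stored pattern."""
    s = state.copy().astype(float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]        # corrupt one bit
restored = recall(W, noisy) # the feedback dynamics restore the stored pattern
```

The zeroed diagonal is exactly the "connected to every other neuron but without any self-connection" property the snippet refers to.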

Orthogonal Recurrent Neural Networks and Batch Normalization in Deep Neural Networks

... these networks to prevent overfitting, but simply optimized over their learning rates to obtain the highest test accuracy ... other networks, indicating how batch normalization can decrease training ...

Closing Brackets with Recurrent Neural Networks

... activation function. At each time step the input vector is built by concatenating the vector of the current word and the output produced by the hidden layer during the previous time step. The next word is predicted by ...
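The step described in this snippet — concatenating the current word vector with the previous hidden output, then predicting the next word — can be sketched as follows. All names, sizes, and the tanh/softmax choices are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Sketch of one recurrent step as described: the input vector concatenates the
# current word's vector with the previous hidden-layer output, and the next
# word is predicted from the new hidden state. Sizes are arbitrary examples.

rng = np.random.default_rng(0)
vocab_size, embed_dim, hidden_dim = 5, 4, 6

E = rng.normal(size=(vocab_size, embed_dim))               # word vectors
W = rng.normal(size=(embed_dim + hidden_dim, hidden_dim))  # input->hidden weights
U = rng.normal(size=(hidden_dim, vocab_size))              # hidden->output weights

def step(word_id, h_prev):
    """One time step: concat(word vector, previous hidden output) -> prediction."""
    x = np.concatenate([E[word_id], h_prev])  # build the input vector
    h = np.tanh(x @ W)                        # hidden layer with its activation function
    logits = h @ U
    probs = np.exp(logits - logits.max())     # softmax over the vocabulary
    return h, probs / probs.sum()

h = np.zeros(hidden_dim)
for w in [0, 3, 1]:                           # feed a short word-id sequence
    h, p = step(w, h)
predicted = int(p.argmax())                   # index of the predicted next word
```

Carrying `h` across iterations is what makes the network recurrent: the previous time step's hidden output re-enters the input at the next step.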
