A doubly linked list is a linked data structure built as a set of nodes, where each node contains three portions: one data portion and two node-reference portions (previous and next). Each reference portion is connected to the neighboring node if one exists; otherwise the reference portion is marked as NULL.
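As a concrete sketch of that node layout, here is a minimal doubly linked list in Java; the class and method names (DoublyLinkedList, Node, addLast, printForward) are illustrative, not taken from any particular project.

```java
// Minimal doubly linked list sketch; names are illustrative.
public class DoublyLinkedList {
    // Each node holds one data portion and two node-reference portions.
    private static class Node {
        int data;
        Node prev; // NULL (null in Java) when there is no left neighbor
        Node next; // NULL (null in Java) when there is no right neighbor
        Node(int data) { this.data = data; }
    }

    private Node head;
    private Node tail;

    // Append a node at the tail, wiring both reference portions.
    public void addLast(int data) {
        Node node = new Node(data);
        if (head == null) {
            head = tail = node;
        } else {
            node.prev = tail;
            tail.next = node;
            tail = node;
        }
    }

    // Walk from head to tail by following the next references.
    public void printForward() {
        for (Node n = head; n != null; n = n.next) {
            System.out.print(n.data + " ");
        }
        System.out.println();
    }

    public static void main(String[] args) {
        DoublyLinkedList list = new DoublyLinkedList();
        list.addLast(1);
        list.addLast(2);
        list.addLast(3);
        list.printForward(); // prints: 1 2 3
    }
}
```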
This project is developed with a console user interface (CUI). It has its own scanning algorithm for finding malicious code in each file during a scan. The core idea is to search for and match virus signatures in every scanned file or directory. Roughly 90% of viruses and worms carry their own signature (a recurring text or byte pattern found in all infected binary or archive files), while the remaining 10% are identified by their execution behavior.
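To illustrate the signature-matching idea (not this project's actual scanner), here is a hedged Java sketch: it decodes file bytes with ISO-8859-1 so every byte value survives, then checks the content against a small signature list. The first signature is a prefix of the public EICAR test string; "FAKE-VIRUS-MARKER" is a made-up placeholder.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Minimal signature-matching sketch; the signature list is a stand-in,
// not a real virus database.
public class SignatureScanner {
    private static final List<String> SIGNATURES = List.of(
        "X5O!P%@AP[4\\PZX54(P^)7CC)7}$EICAR", // prefix of the EICAR test string
        "FAKE-VIRUS-MARKER"                    // hypothetical placeholder
    );

    // Returns true if any known signature appears in the file's bytes.
    static boolean scanFile(Path file) throws IOException {
        // ISO-8859-1 maps each byte to one char, so binary content is preserved.
        String content = new String(Files.readAllBytes(file), StandardCharsets.ISO_8859_1);
        for (String sig : SIGNATURES) {
            if (content.contains(sig)) {
                return true; // signature match: flag the file as infected
            }
        }
        return false;
    }

    public static void main(String[] args) throws IOException {
        Path target = Path.of(args[0]);
        System.out.println(target + (scanFile(target) ? " : INFECTED" : " : clean"));
    }
}
```

A real scanner would stream large files instead of loading them whole and would use a multi-pattern algorithm (e.g. Aho-Corasick) rather than repeated contains calls, but the matching principle is the same.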
A web crawler is a computer program used to collect key values (HREF links, image links, metadata, etc.) from a given website URL. It is designed to intelligently follow the HREF links fetched from a previous URL, so the crawler can jump from one website to another. It is usually called a web spider or web bot. This mechanism acts as the backbone of a web search engine.
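The sketch below shows one crawl step of that mechanism in Java (11+): fetch a page and extract its HREF links, each of which could be queued as the next URL to visit. The seed URL is a placeholder, and a real crawler would also need a visited-set to avoid loops plus politeness rules (robots.txt, rate limiting).

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// One crawl step: fetch HTML, pull out absolute HREF links.
public class CrawlerSketch {
    // Matches href="http..." attributes; a simplification of real HTML parsing.
    private static final Pattern HREF =
        Pattern.compile("href\\s*=\\s*\"(http[^\"]+)\"", Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws Exception {
        String seed = "https://example.com/"; // hypothetical seed URL
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(seed)).build();
        String html = client.send(request, HttpResponse.BodyHandlers.ofString()).body();

        // Each extracted link is a candidate URL for the next crawl step.
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            System.out.println(m.group(1));
        }
    }
}
```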
A linked list is a linked data structure built as a set of nodes, where each node contains two portions: a data portion and a node-reference portion. The data portion stores the value, which may be a primitive type (e.g. int, float) or a user-defined type (e.g. an object reference). The reference portion is connected to the neighboring node if one exists; otherwise the reference portion is marked as NULL.
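For comparison with the doubly linked version above, here is a minimal singly linked list sketch in Java; again the names (LinkedListSketch, Node, addFirst, print) are illustrative.

```java
// Minimal singly linked list sketch; names are illustrative.
public class LinkedListSketch {
    private static class Node {
        int data;  // data portion
        Node next; // reference portion; null when there is no neighbor
        Node(int data, Node next) { this.data = data; this.next = next; }
    }

    private Node head;

    // Prepend a node; its reference portion points at the old head.
    public void addFirst(int data) {
        head = new Node(data, head);
    }

    // Follow the reference portions until NULL terminates the list.
    public void print() {
        for (Node n = head; n != null; n = n.next) {
            System.out.print(n.data + " -> ");
        }
        System.out.println("NULL");
    }

    public static void main(String[] args) {
        LinkedListSketch list = new LinkedListSketch();
        list.addFirst(3);
        list.addFirst(2);
        list.addFirst(1);
        list.print(); // prints: 1 -> 2 -> 3 -> NULL
    }
}
```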