A sandbox is a play area (or more properly, a staging area) where tests can be conducted before a system goes live. PayPal, for example, offers webmasters a sandbox area where they can test the integration of PayPal into their web sites before taking real orders through PayPal.
A search engine operates like a directory for the World Wide Web. A search engine is a web site that attempts to index (store) all or most of the Web in a database. When a user visits the search engine and conducts a search, the search engine returns results from its database, together with links to the actual sites from which the information was collected. Search engines use proprietary algorithms (formulas) to determine which web pages are most relevant to a user's search and display the most relevant results first.
The act of creating or changing web pages so that their ranking in search engine results pages improves. Search engine optimization (SEO) is not quite the same as search engine marketing (SEM). The latter is a collective name for all search engine marketing activities, including SEO, PPC marketing, link building, etc.
Largely a myth propagated by companies that charge a fee for search engine submission. In the early days of the web it was necessary to "tell" search engines about the existence of your web site so that they could send visitors your way. Search engines had - and many still have - submission forms where users could add their sites. Nowadays, though, search engines are so good at spidering the Web that they are likely to find your site very soon after it is published, making search engine submission redundant.
Used in a Web form, a select box lets the user choose a value from a pre-defined list, rather than letting the user type in a box. It is handy for form fields like country names where a pre-defined list offers consistency. If the webmaster allowed users to type the country names themselves, some would type "America", some "USA", some "United States" etc.
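A minimal sketch of such a pre-defined list in HTML (the form action and country values here are just illustrative):

```html
<!-- The user picks one value from the list instead of typing a country name -->
<form action="/signup" method="post">
  <label for="country">Country:</label>
  <select id="country" name="country">
    <option value="US">United States</option>
    <option value="GB">United Kingdom</option>
    <option value="DE">Germany</option>
  </select>
</form>
```

Whichever option the user picks, the server always receives one of the pre-defined values, which is what gives the form its consistency.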
The vast majority of sites on the Web are hosted on shared hosting servers (as opposed to dedicated servers), where more than one web site "lives" on one server. It's a great concept, because the cost of the server is shared, but it requires careful management of resources to ensure that no single web site on the server uses up more than its share of resources.
Like its real-world counterpart, a web site shopping cart gives users somewhere to "put" the merchandise that they want to purchase from the site. When they have collected all the goodies they want to purchase, they click a "checkout" link to go to the payment section of the site. Shopping carts are typically built in PHP and use a database or cookies to store user information.
A sitemap is a page within a web site that offers links to all the pages within that web site - or, at least, to the important pages within the web site. It is a great tool to offer your web site visitors and it helps search engine spiders to quickly find all of the internal pages on your web site, including pages that they might otherwise miss.
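As a sketch, a sitemap page can be as simple as a list of links to the site's important pages (the page names below are hypothetical):

```html
<!-- A simple sitemap page: one link per important page on the site -->
<h1>Sitemap</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/about.html">About</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```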
Social ads are small advertisements displayed on social media sites. They differ from other ads in that the type of ad displayed to you is determined by your shopping behaviour (or that of your friends) on the social media site.
A spambot operates much like a spider, but its mission is usually to collect email addresses for spam email campaigns. Webmasters often see a sharp increase in spam email received once they publish an email address on a site. A good remedy is to publish a contact form instead of a raw email address and to protect the form with a CAPTCHA.
A black hat SEO technique that relies on search engines indexing multiple pages that are almost identical. Like all spam techniques, it was effective early on, but search engines quickly caught up and started penalizing, or at least ignoring, such pages.
In the context of web design, a spider is not an eight-legged creepy crawly. It's a program that surfs the Internet, jumping from one page to the next by following links just like a human would. The spider's job is to store ("index") all the information it finds. A search engine like Google makes use of spiders to collect information from and about web sites to facilitate its web search.
Server side includes (SSI) are small pieces of code that are saved as separate files and can then be included in the pages of a site. For example, the header section of a site is typically handled as an SSI: the header is built as a separate file and included at the top of every page of the site. This makes managing the site a snap, because a change to the header now means changing just one file rather than updating every individual page on the site.
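A minimal sketch of the header example using the Apache-style include directive (the file name is just an example; on many servers, pages using SSI are served with an .shtml extension):

```html
<!-- The server replaces this directive with the contents of header.html
     before the page is sent to the browser -->
<!--#include virtual="/includes/header.html" -->
<p>Page-specific content goes here.</p>
```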
Web site content that gives users an incentive to revisit often. Usually this requires some sort of investment from the visitor. For example, Q&A sites often have a scoring system for answers. If a user answers many tech questions and her answers are often tagged as the best answer, she can eventually come to be seen as an expert in her field.
A subdomain is a third-level domain that is often used to organize information categories on a web site, much as directories are. For example, "www.example.com/blog" could be replaced with "blog.example.com". In this example, "blog" is the subdomain.