This page answers some frequently asked questions about the technical side of Wikipedia, including its software and hardware.
Note: if you do not find the answer to a question about a Wikipedia technical problem here, you can raise it at the Village pump.
- When a second (or later) person attempts to save the page, MediaWiki will attempt to merge their changes into the current version of the text. If the merge fails, the user will receive an "edit conflict" message and the opportunity to merge their changes manually. If multiple consecutive conflicts occur, a slightly different message is generated. This is similar to the Concurrent Versions System (CVS), a widely used software version management system.
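A minimal sketch of the merge logic, assuming a simplified line-by-line model (MediaWiki actually uses a diff3-style three-way merge that also handles insertions and deletions):

```python
def merge3(base, mine, theirs):
    """Naive line-by-line three-way merge; assumes both edits keep the
    line count of the base version. Returns (merged_lines, conflict_flag).
    """
    merged, conflict = [], False
    for b, m, t in zip(base, mine, theirs):
        if m == t:            # both sides agree (or neither changed the line)
            merged.append(m)
        elif m == b:          # only the other editor changed this line
            merged.append(t)
        elif t == b:          # only we changed this line
            merged.append(m)
        else:                 # both changed the same line: edit conflict
            merged.append(m)
            conflict = True
    return merged, conflict

# Two editors change different lines: the merge succeeds silently.
merged, conflict = merge3(
    ["one", "two", "three"],
    ["one", "TWO", "three"],
    ["one", "two", "THREE"],
)
```

When both editors touch the same line, `conflict` comes back true and the wiki would show the "edit conflict" screen instead of saving.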
- If you entered your e-mail address when you signed up, you can have a new password generated. Click on the "Log in" link in the upper-right corner. Enter your user name, and click the button near the bottom of the page called "Mail me a new password". You should receive an e-mail message with a new random password; you can use it to log in, go to your preferences, and change your password to something you'll remember.
- You can change your password via Special:ChangePassword; you can also find a link to this in your preferences.
- The developers use MediaZilla to keep track of bugs. For more information, see Bug reports.
- To make an official feature request, use MediaZilla. For information on using MediaZilla, please see Bug reports.
- Wikipedia originally ran UseModWiki, a general wiki script by Clifford Adams. In January 2002, we switched to a PHP script, which in turn was completely overhauled the following July to create what we now call MediaWiki.
- MySQL is used for the database backend, Apache is the web server, and PowerDNS is used for DNS.
- The Wikipedia servers' operating systems are Linux and Solaris. The most widely used distribution is Ubuntu. For details see Wikimedia servers.
- See m:Wikimedia servers.
History of Wikipedia hardware
- A brief history of Wikipedia serving:
- Phase I: January 2001 - January 2002
- One of Bomis' servers hosted all Wikipedia wikis running on UseModWiki software
- Phase II: January 2002 - July 2002
- One of Bomis' servers hosted all Wikipedia wikis; English and meta running on the php/mysql-based new software, all other languages on UseModWiki. Runs both the database and the web server on one machine.
- Phase IIIa: July 2002 - May 2003
- Wikipedia gets own server, running English Wikipedia and after a bit meta, with rewritten PHP software. Runs both the database and the web server on one machine.
- One of Bomis' servers continues to host some of the other languages on UseModWiki, but most of the active ones are gradually moved over to the other server during this period.
- Phase IIIb: May 2003 - Feb 2004
- Wikipedia's server is given the code name "pliny". It serves the database for all Phase III wikis and the web for all but English.
- New server, code name "larousse", serves the web pages for the English Wikipedia only. Plans to move all languages' web serving to this machine are put on hold until load is brought down with more efficient software or larousse is upgraded to be faster.
- One of Bomis' servers continued to host some of the other languages on UseModWiki until it died. All are now hosted on pliny; a few more of the active ones have been gradually moved over to the new software, and an eventual complete conversion is planned.
- Phase IIIc: Feb 2004 to Present
- Wikipedia gets a whole new set of servers, paid for through donations to the non-profit Wikimedia Foundation.
- The new architecture has a new database server (suda), with a set of separate systems running Apache, as well as "squids" that cache results (to reduce the load). More details are at m:Wikimedia servers.
- New servers bought as needed, bringing total number to about 350 servers.
- Wikimedia has multiple facilities spread out worldwide, served by different bandwidth suppliers.
- Tampa (pmtpa):
- Amsterdam facility (knams):
- 700Mbit/s to Kennisnet/AS1145
- 10Gbit/s to Tele2/AS1257
- 10Gbit/s to AMS-IX, where multiple peering partners are met
- Connection to Telia/AS1299
- South Korean facility (yaseo):
- See the Wikipedia:Statistics page for information about bandwidth usage.
- Early in Wikipedia's history, in February 2003 the database was about 4 GB in size. By August 2003, this had grown to roughly 16 GB, with uploaded images and media files taking up another gigabyte or so. By April 2004, this had grown to about 57 GB, and was growing at about 1 to 1.4 GB per week, and by October 2004, it had grown to about 170 GB. This includes all languages and support tables but not images and multimedia.
- As of late August 2006, database storage took about 1.2 terabytes:
- English Wikipedia core database: 163G
- Other Florida-based core databases: 213G
- Other Korea-based core databases: 117G
- Text storage nodes: 44G, 44G, 200G, 149G, 166G, 84G, 84G
- This may include free space inside database storage files, as well as a lot of indexing.
- Uploaded files took up approximately 372 gigabytes, excluding thumbnails.
- Compressed database dumps can be downloaded at http://download.wikipedia.org/.
- Wikipedia uses a very simple markup based on UseModWiki. For more details, see Wikipedia:How to edit a page.
- In short: for simplicity and security.
- In more detail: Wikipedia, and wikis in general, are designed to be easy to edit, and HTML is not easy even for writing a simple article. Take linking to an article as an example. To link to the অসম article in HTML, you would have to write
- <a href="অসম">অসম</a>
- Wikipedia's markup is far simpler: [[অসম]]
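To illustrate how little machinery the wiki link form needs, a toy converter from wiki links to HTML anchors fits in one regular expression (a sketch only, not MediaWiki's actual parser, which handles far more cases):

```python
import re

def wikilinks_to_html(text):
    """Convert [[Target]] and [[Target|Label]] wiki links into HTML anchors.
    A toy sketch: spaces in the target become underscores, as in article URLs."""
    def repl(match):
        target = match.group(1)
        label = match.group(2) or target   # no "|Label" part: reuse the target
        return '<a href="/wiki/%s">%s</a>' % (target.replace(" ", "_"), label)
    return re.sub(r"\[\[([^|\]]+)(?:\|([^\]]+))?\]\]", repl, text)
```

For example, `wikilinks_to_html("[[অসম]]")` produces the anchor tag shown above without the editor ever typing HTML.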
- That's not true: some HTML tags do work. Also, HTML table tags were once the only way to create tables (though this can now be done with wiki syntax too). However, there has been some rumbling among the software developers about deprecating most HTML tags.
- For discussions on wiki syntax for tables, see m:Wiki markup tables and m:MediaWiki User's Guide: Using tables for more recent activity; m:WikiShouldOfferSimplifiedUseOfTables for an old beginning activity.
- Also see Wikipedia:How to edit a page.
- Wikipedia uses Unicode (specifically the UTF-8 encoding of Unicode), and most browsers can handle it, but font issues mean that more obscure characters may not display for many users. See the Meta:Help:Special characters page for a detailed discussion of what is generally safe and what is not. That page will be updated over time as more browsers come to support more features.
- See http://www.unicode.org/help/display_problems.html for instructions on how to enable Unicode support for most platforms.
- It simply uses TeX! See the help page on formulas.
- Yes, the complete text and editing history of all Wikipedia pages can be downloaded. See Wikipedia:Database download.
- Note that downloading the database dumps is much preferred over trying to spider the entire site. Spidering the site will take you much longer, and puts a lot of load on the server (especially if you ignore our robots.txt and spider over billions of combinations of diffs and whatnot). Heavy spidering can lead to your spider, or your IP, being barred with prejudice from access to the site. Legitimate spiders (for instance search engine indexers) are encouraged to wait about a minute between requests, follow the robots.txt, and if possible only work during less loaded hours (2:00-14:00 UTC is the lighter half of the day).
- The uploaded images and other media files are not currently bundled in an easily downloadable form; if you need one, please contact the developers on the wikitech-l mailing list. Please do not spider the whole site to get images.
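The polite-client behaviour recommended above (obey robots.txt, wait about a minute between requests) can be sketched with the Python standard library. The robots.txt content, bot name, and helper names here are hypothetical, made up for illustration:

```python
import time
import urllib.request
import urllib.robotparser

# Hypothetical robots.txt, for demonstration only; a real bot would
# download http://en.wikipedia.org/robots.txt and parse that instead.
ROBOTS_TXT = """\
User-agent: *
Disallow: /w/
"""

def make_policy(robots_txt):
    """Build a robots.txt policy object from raw text (offline, testable)."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

def polite_fetch(urls, policy, delay=60, agent="ExampleBot/1.0"):
    """Fetch only the URLs the policy allows, sleeping `delay` seconds
    between requests, per the guidelines above."""
    pages = {}
    for url in urls:
        if not policy.can_fetch(agent, url):
            continue  # honour robots.txt exclusions
        req = urllib.request.Request(url, headers={"User-Agent": agent})
        with urllib.request.urlopen(req) as resp:
            pages[url] = resp.read()
        time.sleep(delay)  # roughly one request per minute
    return pages
```

`make_policy` keeps the allow/deny logic testable without network access; for bulk content, the database dumps remain the right tool.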
- Ed Summers has written WWW::Wikipedia.
- If you're just after retrieving a topic page, the following Perl sample code works. In this case, it retrieves and lists the Main Page, but modifications to the $url variable for other pages should be obvious enough. Once you've got the page source, Perl regular expressions are your friend in finding wiki links.
use LWP::UserAgent;
use HTTP::Request;

my $browser = LWP::UserAgent->new();
my $url = "http://en.wikipedia.org/wiki/Wikipedia%3AMain_Page";
my $webdoc = $browser->request(HTTP::Request->new('GET', $url));
if ($webdoc->is_success) {          # ...then it's loaded the page OK
    print $webdoc->title, "\n\n";   # page title
    print $webdoc->content, "\n\n"; # page text
}
- Note that all (English) Wikipedia topic entries can be accessed using the conventional prefix "http://en.wikipedia.org/wiki/", followed by the topic name (with spaces turned into underscores, and special characters encoded using the standard URL encoding system).
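The prefix-plus-encoding rule just described can be sketched with the Python standard library; `topic_url` is a hypothetical helper name, not part of any Wikipedia API:

```python
from urllib.parse import quote

def topic_url(topic, lang="en"):
    """Build a Wikipedia article URL: spaces become underscores, then
    special characters are percent-encoded per the standard URL rules.
    Colons are kept readable (namespace prefixes like "Wikipedia:")."""
    return "http://%s.wikipedia.org/wiki/%s" % (
        lang, quote(topic.replace(" ", "_"), safe="_:"))
```

For example, `topic_url("Main Page")` yields http://en.wikipedia.org/wiki/Main_Page, and characters such as "+" are percent-encoded automatically.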
- See also m:Machine-friendly wiki interface.
- Cookies are not required to read or edit Wikipedia, but they are required in order to log in and link your edits to a user account.
- When you log in, the wiki will set a temporary session cookie which identifies your login session; this will be expired when your browser exits (or after an inactivity timeout), and is not saved on your hard drive.
- Another cookie will be saved which lists the user name you last logged in under, to make subsequent logins just a teensy bit easier. (Actually two: one with your name, and one with your account's internal ID number; they must match up.) These cookies expire after 180 days. If this worries you, clear your cookies after completing your session.
- If you check the "remember my password" box on the login form, another cookie will be saved with a hash of your password (not the password itself). As long as this remains valid, you can bypass the login step on subsequent visits to the wiki. The cookie expires after 180 days, or is removed if you log out. If this worries you, don't use the option. (You should not use it on a public terminal!)
- This could be a result of your cookie, browser cache, or firewall/Internet security settings. Or, to quote Tim Starling (referring to a question about "remembering password across sessions"):
- "The kind of session isn't a network session strictly speaking, it's an HTTP session, managed by PHP's session handling functions. This kind of session works by setting a cookie, just like the "remember password" feature. The difference is that the session cookie has the "discard" attribute set, which means that it is discarded when you close your browser. This is done to prevent others from using your account after you have left the computer.
- The other difference is that PHP sessions store the user ID and other such information on the server side. Only a "session key" is sent to the user. The remember password feature stores all required authentication information in the cookie itself. On our servers, the session information is stored in memcached, a system for non-durable (unreliable) caching. Session information may occasionally be lost or go missing temporarily, causing users to be logged out. The simplest workaround for this is to use the remember password feature, as long as you are not worried about other people using the same computer." from the Wikipedia:Village pump (technical) on May 4, 2005. (italics added).
- In other words: click the "remember me" box when logging in.
- See also Help:Logging in.
- You can, but depending on your needs you might be better served using something else; MediaWiki is big and complex. See first Wiki software for a list of wiki scripts.
- If after scanning that you're still sure you want to use MediaWiki; see the MediaWiki web site for details on downloading, installing and configuring the software.
- Page hit counting is a feature of the MediaWiki software, but it is disabled on the Wikipedia site for performance reasons. Wikipedia is one of the most popular web sites in the world and uses a cluster of more than 400 servers (as of January 2011) to handle the load. Nearly 80% of the load is handled by about 100 front-end cache servers which store copies of pages so they can be served without having to be rebuilt each time from the database. Hit-count data is therefore not collected centrally, but is aggregated from all the servers and is available at http://stats.grok.se/.
- You can also view the page hits for a particular page from that page's history: choose "Page view statistics", listed there as an external tool.
- To view a low-bandwidth Main Page suitable for wireless users, select the Wikipedia:Main Page alternative (simple layout) link. That main page has a link to the text-only version of the main page. For now, direct entry of the URL into your wireless device's browser is the most convenient way to get to the articles. If you know a one-word article, such as Science, you can use that article to gain entry to your favorite topics.
- Also, if you log in, try selecting the Chick skin in your preferences; it removes the material around the edges of the screen and gives you more space to read the articles themselves.
- No, although it's random enough to provide a small sample of articles reliably.
- We have an index on the page table called page_random, which is a random floating point number uniformly distributed on [0, 1). Special:Random chooses a random double-precision floating-point number, and returns the next article with a page_random value higher than the selected random number. Some articles will have a larger gap before them, in the page_random index space, and so will be more likely to be selected. So the actual probability of any given article being selected is in fact itself random.
- The page_random value for new articles, and the random value used by Special:Random, is selected by reading two 31-bit words from a Mersenne Twister, which is seeded at each request by PHP's initialisation code using a high-resolution timer and the PID. The words are combined using:
- (mt_rand() * $max + mt_rand()) / $max / $max
- Some old articles had their page_random value reset using MySQL's RAND():
- rand_st->seed1=(rand_st->seed1*3+rand_st->seed2) % rand_st->max_value;
- rand_st->seed2=(rand_st->seed1+rand_st->seed2+33) % rand_st->max_value;
- return (((double) rand_st->seed1)/rand_st->max_value_dbl);
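The gap bias described above is easy to demonstrate with a simulation (the page_random values below are made up for illustration):

```python
import bisect
import random

# Hypothetical page_random values for five pages, sorted as in the index.
page_random = [0.05, 0.10, 0.60, 0.62, 0.90]
pages = ["A", "B", "C", "D", "E"]

def special_random(rng):
    """Pick the next page whose page_random exceeds a uniform draw,
    wrapping around to the first page when the draw lands above all values."""
    r = rng.random()
    i = bisect.bisect_right(page_random, r)
    return pages[i % len(pages)]

rng = random.Random(42)
counts = {p: 0 for p in pages}
for _ in range(100_000):
    counts[special_random(rng)] += 1
# "C" sits after a gap of 0.50 in the index space and wins about half
# the draws, while "D" (gap of only 0.02) is picked rarely.
```

Because each article's selection probability equals the size of the gap before its page_random value, the distribution is only approximately uniform, exactly as stated above.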
See the troubleshooting page. If your question is not answered there either, you can raise it at the Village pump.