This post was last edited by The Cauliflower Thief at 2020-7-20 11:31. I run a small scraper/collection site. It has only collected a little over 8,000 articles, yet the database is already over 4 GB. A total of 190,000 URLs have been imported; at this rate the database will end up over 80 GB, and an ordinary server won't be able to cope. [Attachment: siteras.png (76.91 KB, downloads: 0, uploaded at 11:31)]

-----------------------------------------------------
Netizen reply: Set up proper indexes, split large tables, and add caching. Not a big problem.
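A minimal sketch of the indexing advice above, using an in-memory SQLite database to stand in for MySQL (the `articles` table, its columns, and the index name are all made up for illustration). SQLite's `EXPLAIN QUERY PLAN` shows whether a lookup actually uses the index:

```python
import sqlite3

# In-memory SQLite stands in for MySQL; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, url TEXT, title TEXT)")
conn.executemany(
    "INSERT INTO articles (url, title) VALUES (?, ?)",
    [(f"https://example.com/{i}", f"title {i}") for i in range(1000)],
)

# Without an index, a lookup by url scans every row; with one, it becomes
# a direct index search.
conn.execute("CREATE INDEX idx_articles_url ON articles(url)")

# The plan's detail column names the index when it is used.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM articles WHERE url = ?",
    ("https://example.com/123",),
).fetchone()
print(plan[3])  # detail mentions idx_articles_url
```

The same idea carries over to MySQL (`CREATE INDEX` plus `EXPLAIN`); the point is to verify the plan rather than assume the index is used.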
Netizen reply: Quote: Zhou Runfa, posted 2020-7-20 11:32: Why is my database of 6,000 articles only 5 MB?
Netizen reply: I have several hundred thousand records, and the 4.2 GB database still runs fine for now (a few of the sites have only tens of thousands). Windows Server 2008. The only issue is that it needs about half an hour to warm its cache after the machine first boots; after that it runs normally, and it then goes 20 days between restarts without any problem.
Netizen reply: Quote: squalll, posted 2020-7-20 11:38: I have several hundred thousand records, and the 4.2 GB database still runs fine for now (a few of the sites have only tens of thousands). Windows Server 2008. It just needs half an hour to warm its cache after the machine first boots...
Netizen reply: A site-group matrix can reach 600+ GB at most. With the indexes done well, there's no big problem. It also depends on the program: poorly written SQL is very resource-hungry. For example, a WordPress site with 2 million articles will make you want to die.
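To illustrate the "it depends on the SQL" point, here is a hedged sketch (again with SQLite standing in for MySQL; the `posts` table and index name are invented). An equality lookup on an indexed column produces an index search, while a leading-wildcard `LIKE` on the text body forces a full-table scan that touches every row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, slug TEXT, body TEXT)")
conn.execute("CREATE INDEX idx_posts_slug ON posts(slug)")

# Lookup on the indexed column: the planner reports a SEARCH via the index.
good = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM posts WHERE slug = 'hello'"
).fetchone()[3]

# Leading-wildcard LIKE on the body: the planner falls back to a SCAN,
# reading every row regardless of indexes.
bad = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM posts WHERE body LIKE '%hello%'"
).fetchone()[3]

print(good)  # a SEARCH using idx_posts_slug
print(bad)   # a SCAN of posts
```

On a 2-million-row table the second shape of query is exactly what makes a server "want to die".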
Netizen reply: Quote: No idea, posted 2020-7-20 11:44: Wouldn't it be better to save the text to files directly, and have MySQL store only the index?
Netizen reply: Quote: Xiaoye, posted 2020-7-20 11:45: A site-group matrix can reach 600+ GB at most. With the indexes done well, there's no big problem. It also depends on the program: poorly written SQL is very resource-hungry...
Netizen reply: Quote: xcpan710, posted 2020-7-20 11:55: Why are you cramming full text into the database? Wouldn't it be better to write it to .txt files?
Netizen reply: Really? 8,000 records take 4 GB? My 800,000+ records are only about 2.5 GB. Both are article-type records, with plenty of text in each article. Why is yours so big?
Netizen reply: Quote: ouou8, posted 2020-7-20 13:01: Really? 8,000 records take 4 GB? My 800,000+ records are only about 2.5 GB. Both are article-type records, with plenty of text in each article. Yours...
Netizen reply: This post was last edited by ghostcir at 2020-7-20 14:47. I have 10 million posts stored, and the whole thing is only 30 GB.
Netizen reply: Look at Empire CMS, and Jieqi's too; they run on MySQL, and hundreds of millions of rows are fine~
Netizen reply: Set up caching properly, and use a cloud database or a dedicated machine on the internal network just for the database.
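The caching advice can be sketched as a read-through cache in front of the database. This is a minimal illustration, not anyone's actual setup: `fetch_from_db` is a stand-in for a real MySQL query, and the cache here is just `functools.lru_cache`:

```python
import functools

CALLS = 0  # counts how many times the "database" is actually hit

def fetch_from_db(article_id: int) -> str:
    # Stand-in for a real MySQL query.
    global CALLS
    CALLS += 1
    return f"body of article {article_id}"

@functools.lru_cache(maxsize=4096)
def get_article(article_id: int) -> str:
    # Repeated reads of the same article are served from memory,
    # so only the first read per article reaches the database.
    return fetch_from_db(article_id)

get_article(1)
get_article(1)
get_article(2)
print(CALLS)  # 2 — the second read of article 1 never touched the database
```

In production the in-process cache would typically be replaced by something shared like Redis or Memcached, but the shape of the logic is the same.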
Netizen reply: Quote: trips, posted 2020-7-20 14:51: Look at Empire CMS, and Jieqi's too; they run on MySQL, and hundreds of millions of rows are fine~
Netizen reply: Quote: ouou8, posted 2020-7-20 15:51: Jieqi stores the article text in txt files; the database holds only the titles and a short introduction, so it stays very small. If hundreds of millions of rows were all full text, good luck. ...
