ETF tracking | A double-long volatility ETF rose 16%; a leveraged long ETF tracking Super Micro Computer plummeted more than 70% in two days.
On Thursday, October 31, the three major US stock indexes all fell. The S&P 500 closed down 108.22 points, or 1.86%, at 5,705.45, for a cumulative October decline of 0.99% that ended its run of monthly gains since May. The Dow Jones Industrial Average fell 378.08 points, or 0.90%, to 41,763.46, down 1.34% for October and likewise ending its uptrend since May. The tech-heavy Nasdaq Composite also closed lower.
OpenAI has reached an agreement with media giant Hearst, giving ChatGPT another source of high-quality content.
The partnership covers content from Esquire, Elle, and other flagship magazines; the AI startup has previously reached similar deals with News Corp and other news publishers.
Germany's antitrust authority is raising a red flag: Microsoft's AI products and its collaboration with OpenAI face stricter scrutiny.
The German regulator, the Federal Cartel Office (FCO), has designated Microsoft a "company of paramount significance for competition across markets", which means that once the FCO deems intervention necessary, it can act to prohibit the company's anti-competitive conduct.
OpenAI wants to spend 7 trillion US dollars to build chip fabrication plants.
At the end of last year, OpenAI CEO Sam Altman began promoting a bold plan.
Will OpenAI, long held back by its nonprofit parent, eventually break free and go public?
However, some analysts point out that if OpenAI eventually pursues an initial public offering (IPO) in the US, the company will face many challenges, the most significant being the need to rewrite its corporate charter.
OpenAI has unveiled its groundbreaking o1 model and warned that it also raises the risk of biological weapons.
①o1 is rated a "medium" risk on issues related to chemical, biological, radiological, and nuclear (CBRN) weapons - the highest risk rating OpenAI has ever given one of its models; ②Experts say that AI software with more advanced capabilities, such as step-by-step reasoning, is more likely to be misused by malicious actors.