It looks like around half of the BE lines went off at around 02:43 and the rest at 02:58.
We were not aware of any planned work this morning, though various work has been planned for this week.
We have tried routing traffic via our old rack as well, and no joy.
Obviously customers using BE+BT lines as a fallback arrangement are fine, using BT.
Obviously customers using 3G backup are working.
If you only have a BE line you may want to consider some of the fallback arrangements we can offer. Though it is rare for BE to be the carrier that has failed, it is sensible to consider contingencies.
That said, we would hope they have this fixed during the night.
There were unplanned outages on 14 exchanges at approximately those times; we're still waiting to find out why this occurred.
O2 have confirmed that changes made by them to the network overnight resulted in loss of service to large parts of the network, including all wholesale services.
Their engineers are currently rebuilding configurations on all aggregation nodes on their network to reverse the changes. We are beginning to see sessions establish again, about 20% have reconnected in the past few minutes.
O2 engineers have restored the configuration on about 40% of core devices, and we are seeing roughly that percentage of sessions back up.
It appears that O2 engineers have now finished restoring configurations, as we can see all ISAMs on our monitoring again and we are seeing the majority of customers back online.
This was due to BE engineering work that failed. An initial report of the problem is available at http://aa.net.uk/news-2012-02-10-be.html