Layla – My blog
Saturday, October 6, 2012
Oracle Misfires in Fiscal 2Q, Raising Tech Worries
Oracle stumbled in its latest quarter as the business software maker struggled to close deals, a signal of possible trouble ahead for the technology sector.
The performance announced Tuesday covered a period of economic turbulence which has raised concerns that major companies and government agencies may curtail technology spending.
Oracle's results for the three months ending in November suggested the cutbacks have already started. Management reinforced that perception with a forecast calling for meager growth in the current quarter, which ends in February. The developments alarmed investors, causing Oracle Corp. shares to slide 10 percent.
In a telling sign of weakening demand, Oracle's sales of new software licenses edged up just 2 percent from the same time last year. Analysts had expected a double-digit gain in new software licenses. The company had predicted an increase of at least 6 percent and as much as 16 percent.
Wall Street focuses on this part of the business because selling new software products generates a stream of future revenue from maintenance and upgrades.
Oracle's software is a staple in companies and government agencies throughout the world. Its database products help companies store and manage information. Its line of applications automates a wide range of administrative tasks.
Part of the problem was that technology decision makers delayed signing contracts during the final few days of the quarter, according to Safra Catz, Oracle's chief financial officer. That could be an indication that companies and financially strapped government agencies are treading more carefully as Europe's debt problems threaten to hobble a still-fragile global economy.
"Clearly, this quarter was not what we thought it would be," Catz told analysts during a Tuesday conference call. She said the company is hoping some of the deals that were postponed in the last quarter will be completed within the next two months.
Oracle's weakest markets were in the U.S., Europe and Japan.
Things looked even bleaker in Oracle's computer hardware division, which the company has been trying to build since buying fallen Silicon Valley star Sun Microsystems for $7.4 billion last year. Oracle's hardware revenue dropped 10 percent from the same time last year.
Oracle earned $2.2 billion, or 43 cents per share, in its fiscal second quarter. That was a 17 percent increase from net income of $1.9 billion, or 37 cents per share at the same time last year.
If not for certain items, Oracle said it would have earned 54 cents per share. That figure fell below the average estimate of 57 cents per share among analysts polled by FactSet.
Revenue for the period edged up 2 percent from last year to $8.8 billion. Analysts, on average, had projected revenue of $9.2 billion.
In the current quarter, Oracle expects its adjusted earnings per share to range from 55 cents to 58 cents -- below the average analyst estimate of 59 cents per share. Revenue is expected to rise by 2 percent to 5 percent from the same time in the previous year. If Oracle hits the top end of that target, it would translate to revenue of about $9.2 billion -- below the analyst estimate of $9.4 billion, according to FactSet.
Oracle has expanded its sales force by about 1,700 people to fish for more customers during the second half of its fiscal year. About 111,000 employees worked at Oracle as of Nov. 30.
The company's shares shed $2.91 to hit $26.26 in extended trading after the second-quarter figures were released. The stock has been sagging since hitting $36.50 in May.
In an effort to bolster the stock, Oracle announced it will spend an additional $5 billion to buy back its shares. The company, which is based in Redwood Shores, Calif., didn't set a timetable to complete the stock purchases. Oracle spent about $1 billion buying 33 million shares in its most recent quarter.
Friday, September 7, 2012
Data Mining
Locality Sensitive Hashing for Documents
Even though we can use minhashing to compress large documents into small
signatures and preserve the expected similarity of any pair of documents, it
still may be impossible to find the pairs with greatest similarity efficiently.
The reason is that the number of pairs of documents may be too large, even if there
are not too many documents.
Example: Suppose we have a million documents, and we use signatures of length 250. Then we use 1000 bytes per document for the signatures, and the entire data fits in a gigabyte – less than a typical main memory of a laptop. However, there are 1,000,000-choose-2, or about half a trillion, pairs of documents. If it takes a microsecond to compute the similarity of two signatures, then it takes almost six days to compute all the similarities on that laptop.
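To make the arithmetic concrete, here is a quick check of those figures (a small Python snippet added as illustration; the variable names are ours, not from the text):

```python
# Verify the numbers quoted above: about half a trillion pairs of documents,
# and almost six days of work at one microsecond per signature comparison.
n = 1_000_000
pairs = n * (n - 1) // 2       # "1,000,000 choose 2" pairs of documents
seconds = pairs * 1e-6         # one microsecond per comparison
print(pairs)                   # 499999500000 -- roughly half a trillion
print(seconds / 86_400)        # ~5.8 days of continuous computation
```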
If our goal is to compute the similarity of every pair, there is nothing we
can do to reduce the work, although parallelism can reduce the elapsed time.
However, often we want only the most similar pairs or all pairs that are above
some lower bound in similarity. If so, then we need to focus our attention only
on pairs that are likely to be similar, without investigating every pair.
There is a general theory of how to provide such focus, called locality-sensitive hashing
(LSH) or near-neighbor search. In this section we shall consider a specific form
of LSH, designed for the particular problem we have been studying: documents,
represented by shingle-sets, then minhashed to short signatures. In Section 3.6
we present the general theory of locality-sensitive hashing and a number of
applications and related techniques.
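As a preview of the banding technique, the sketch below shows the idea in Python: split each minhash signature into b bands of r rows, hash each band of every document to a bucket, and take as candidate pairs only those documents that land in the same bucket for at least one band. The function name, the toy signatures, and the choice of 3 bands of 2 rows are illustrative assumptions, not taken from the text.

```python
# A minimal sketch of LSH by banding on minhash signatures.
from collections import defaultdict
from itertools import combinations

def lsh_candidate_pairs(signatures: dict, bands: int, rows: int) -> set:
    """signatures maps doc_id -> minhash signature (a list of length bands*rows)."""
    candidates = set()
    for band in range(bands):
        buckets = defaultdict(list)
        for doc_id, sig in signatures.items():
            chunk = tuple(sig[band * rows:(band + 1) * rows])
            buckets[(band, chunk)].append(doc_id)             # hash this band to a bucket
        for docs in buckets.values():
            candidates.update(combinations(sorted(docs), 2))  # bucket mates become candidates
    return candidates

# Toy example: documents 'a' and 'b' agree on the first band, so they become a candidate pair.
sigs = {"a": [1, 2, 3, 4, 9, 9], "b": [1, 2, 3, 4, 7, 8], "c": [5, 6, 7, 8, 1, 2]}
print(lsh_candidate_pairs(sigs, bands=3, rows=2))             # {('a', 'b')}
```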
Thursday, July 12, 2012
Cloud computing in June 2012
Though the concept of “clouds” is not new, it is indisputable that they have proven a major commercial success over recent years and will play a large part in the ICT domain over the next 10 years or more, as future systems will further exploit the capabilities of managed services and resource provisioning. Clouds are of particular commercial interest not only because of the growing tendency to outsource IT, so as to reduce management overhead and to extend existing, limited IT infrastructures, but even more importantly because they reduce the entry barrier for new service providers wanting to offer their capabilities to a wide market with a minimum of entry costs and infrastructure requirements – in fact, the special capabilities of cloud infrastructures allow providers to experiment with novel service types whilst reducing the risk of wasting resources.
Cloud systems should not be misunderstood as just another form of resource-provisioning infrastructure; in fact, as this report shows, multiple opportunities arise from the principles underlying cloud infrastructures that will enable further types of applications and reduce the development and provisioning time of different services. Cloud computing has particular characteristics that distinguish it from classical resource and service provisioning environments:
* it is (more-or-less) infinitely scalable;
* it provides one or more of: an infrastructure for platforms, a platform for applications, or applications (via services) themselves;
* thus clouds can be used for every purpose, from disaster recovery/business continuity through to a fully outsourced ICT service for an organisation;
* clouds shift the costs for a business opportunity from CAPEX to OPEX, which allows finer control of expenditure, avoids costly asset acquisition and maintenance, and reduces the entry threshold barrier;
* the major cloud providers have already invested in large-scale infrastructure and now offer cloud services to exploit it;
* as a consequence, cloud offerings are heterogeneous and lack agreed interfaces;
* cloud providers essentially provide datacentres for outsourcing;
* there are concerns over security if a business places its valuable knowledge, information and data on an external service;
* there are concerns over availability and business continuity – with some recent examples of failures;
* there are concerns over shipping data at the anticipated broadband speeds.
The concept of cloud computing is linked intimately with those of IaaS (Infrastructure as a Service), PaaS (Platform as a Service), SaaS (Software as a Service) and collectively *aaS (Everything as a Service), all of which imply a service-oriented architecture.
Open Research Issues
Cloud technologies and models have not yet reached their full potential, and many of the capabilities associated with clouds have not yet been developed and researched to a degree that allows them to be fully exploited or to meet all requirements under all potential circumstances of usage.
Many aspects are still at an experimental stage, where the long-term impact on provisioning and usage is as yet unknown. Furthermore, plenty of as yet unforeseen challenges arise from exploiting cloud capabilities to their full potential, involving in particular aspects deriving from the large degree of scalability and heterogeneity of the underlying resources. We can thereby distinguish between technological gaps on the one hand, which need to be closed in order to realize cloud infrastructures that fulfil the specific cloud characteristics, and non-technological issues on the other hand, which in particular reduce the uptake and viability of cloud systems:
The technological aspects include in particular:
(1) Scale and elastic scalability – currently restricted to horizontal scale-out, and also inefficient, since it tends towards resource over-use due to limited scale-down capabilities and full replication of instances rather than replication of only the essential segments (see the sketch after this list).
(2) Trust, security and privacy – these always pose issues in any internet-provided service, but due to the specific nature of clouds, additional aspects arise, related for example to multi-tenancy and control over data location. What is more, clouds simplify malicious use of resources, e.g. for hacking purposes, but also for sensitive calculations (such as weapon design).
(3) Handling data in clouds – this is still complicated; in particular, as data size and diversity grow, pure replication is no longer a viable approach, leading to consistency and efficiency issues. Also, the lack of control over data location and missing provenance pose security and legal issues.
(4) Programming models – these are currently not aligned to highly scalable applications and thus do not exploit the capabilities of clouds, whilst they should also simplify development. Along the same lines, developers, providers and users should be able to control and restrict distribution and scaling behaviour.
(5) Systems development and management – this is currently still performed mostly manually, contributing to substantial efficiency and bottleneck issues.
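As a small illustration of the scale-down problem mentioned in point (1), the hypothetical controller below can only add or remove whole replicas, so capacity tends to overshoot demand and lags on the way down; the thresholds, names and numbers are made up for the example and are not from the report.

```python
# Hypothetical threshold-based autoscaler that works only in whole instances.
def desired_replicas(current: int, avg_load: float,
                     scale_out_at: float = 0.75, scale_in_at: float = 0.30,
                     min_replicas: int = 1, max_replicas: int = 10) -> int:
    """Return the new replica count for a given average per-instance load (0..1)."""
    if avg_load > scale_out_at and current < max_replicas:
        return current + 1           # scale out by one full replica
    if avg_load < scale_in_at and current > min_replicas:
        return current - 1           # scale in, one replica at a time
    return current                   # otherwise keep the current footprint

# Example: load spikes then drops; note how capacity trails demand on the way down.
replicas, loads = 2, [0.9, 0.9, 0.8, 0.4, 0.2, 0.2]
for load in loads:
    replicas = desired_replicas(replicas, load)
    print(f"load={load:.1f} -> replicas={replicas}")
```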
On the other hand, non-technological issues play a major role in realizing these technological aspects and in ensuring the viability of the infrastructures in the first instance. These include in particular:
* Economic aspects – covering knowledge about when, why and how to use which cloud system, and how this impacts the original infrastructure (provider); long-term experience is lacking in all these areas.
* Legal issues – a consequence of the dynamic (location) handling of clouds, their scalability and the partially unclear legislative situation on the internet; this covers in particular issues related to intellectual property rights and data protection.
* Green IT – these aspects need to be elaborated further, as the cloud in principle offers “green capabilities” by reducing unnecessary resource and energy consumption.
Wednesday, June 20, 2012
The SCM-CRM Connection
The value disciplines framework is useful for viewing how
traditional SCM and CRM approaches have evolved and why
they will inevitably fall short in terms of generating strategic
advantage in the modern marketplace.
For example, a scale-focused, operationally efficient organization typically concentrates its efforts on refining its supply chain. Cost is everything, leading the company not only to reduce its inventories and streamline its procurement processes (using, for example, electronic data interchange, vendor-managed inventory, and just-in-time processes), but also to narrow its products and services to as few offerings as possible. In this way, the company can reasonably ensure that it manages the entire supply chain in a cost-efficient manner. Customer focus isn’t entirely absent.
It’s simply that the overwhelming share of management
attention is elsewhere. The rule: build it – or provide it
– at the lowest cost, and customers will come.
At the other end of the spectrum are customer-intimate
competitors. These firms focus on building rich customer
relationships rather than optimizing the supply chain. Supply
chain management is a function that must be at least
competently executed by such a firm, but the company’s
attention is centered on offering products and services that
improve its relationships with key clients.
Product leadership companies occupy the middle ground of
the SCM-CRM continuum. While relying on customer
insights and CRM to anticipate customer needs, a steady
stream of new products requiring new suppliers also means
that such a company is likely to see SCM as a critical success
factor in its business. The issue, however, is that collaborative
planning of supply and demand remains haphazard
because each new product or service is managed almost like
a startup. Whatever their capabilities in SCM or CRM, these companies must continually integrate their processes with new suppliers and customers.
Monday, March 26, 2012
Key points of ATM technology
ATM is based on a set of technological innovations that allow it to meet the requirements demanded of it.
Standardization
Although its origins date back to the 1960s, it was in 1988 that the CCITT ratified ATM as the technology for the development of broadband networks (B-ISDN), with the first standards appearing in 1990.
From then until now, ATM has been subjected to a rigorous process of standardization, covering not only simple interoperability at the physical level (SONET and SDH speeds ...) but also multivendor interworking at the service level, standardizing issues such as signaling (UNI, NNI), congestion control, LAN integration, and so on.
This ensures multivendor networking, protects investment and allows strong market development, thereby reducing costs.
Cell based multiplexing
To manage the bandwidth on a link properly, the different sources submitting data must do so in small, fixed units of information.
For ATM, a fixed-size unit of 53 bytes was chosen. Using a fixed size makes it possible to build very specialized hardware modules that switch these cells at the speeds required by broadband networks (and beyond). The unit has to be small so that cells from different sources can be multiplexed quickly on the same link, guaranteeing service quality for sensitive traffic (voice, video, ...).
Connection-oriented
ATM opted for a connection-oriented technology, which, among other things, makes such a small information unit feasible. As mentioned earlier, growth forecasts for ATM required a terminal numbering (addressing) scheme of 20 bytes. Connectionless technologies require each unit of information to carry both the source and the destination addresses. Obviously, 40 bytes of each cell could not be devoted to that purpose (the header overhead would be unacceptable).
Thanks to the connection-oriented approach, the only addressing data included in the cell is the virtual channel identifier, which means a header of only five bytes (leaving 48 bytes for carrying useful information).
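To make the cell layout concrete, here is a minimal Python sketch of the 53-byte cell discussed above: a 5-byte UNI header whose addressing fields are just the virtual path/channel identifiers (VPI/VCI), plus a 48-byte payload. The function names are ours; the field widths follow the standard UNI header format (GFC, VPI, VCI, payload type, CLP and a one-byte header checksum).

```python
# Sketch of a 53-byte ATM cell with a 5-byte UNI header and 48-byte payload.

def hec(header4: bytes) -> int:
    """CRC-8 (x^8 + x^2 + x + 1) over the first 4 header octets, XORed with 0x55 (ITU-T I.432)."""
    crc = 0
    for byte in header4:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0x07) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc ^ 0x55

def build_uni_cell(gfc: int, vpi: int, vci: int, pt: int, clp: int, payload: bytes) -> bytes:
    """Assemble a cell: GFC(4) VPI(8) VCI(16) PT(3) CLP(1) HEC(8) + 48-byte payload."""
    assert len(payload) == 48, "ATM payload is always exactly 48 bytes"
    b0 = (gfc & 0xF) << 4 | (vpi >> 4) & 0xF
    b1 = (vpi & 0xF) << 4 | (vci >> 12) & 0xF
    b2 = (vci >> 4) & 0xFF
    b3 = (vci & 0xF) << 4 | (pt & 0x7) << 1 | (clp & 0x1)
    header4 = bytes([b0, b1, b2, b3])
    return header4 + bytes([hec(header4)]) + payload

cell = build_uni_cell(gfc=0, vpi=1, vci=100, pt=0, clp=0, payload=bytes(48))
print(len(cell))  # 53 -- 5-byte header + 48 bytes of useful information
```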
Quality of Service (QoS)
ATM defines four basic categories of traffic: CBR (Constant Bit Rate), VBR (Variable Bit Rate), UBR (Unspecified Bit Rate) and ABR (Available Bit Rate).
At circuit-creation time, the DTE characterizes the traffic to be sent over the circuit using four parameters (PCR, SCR, MBS and CDVT) within one of these four categories. The network propagates the request towards the destination and internally validates whether it can comply with the requirements. If so, the network accepts the circuit and, from that moment on, guarantees that the traffic will be treated according to the terms negotiated at establishment.
The ATM switches run an algorithm known as the dual leaky bucket, which verifies, cell by cell, that the required quality of service is being delivered. The DTE is allowed to send data over a circuit at more than the negotiated rate; in that case, the ATM switch may discard the cells concerned if saturation occurs at any point in the network.
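To illustrate the policing idea, the sketch below implements a single leaky bucket in the virtual-scheduling style; it is an illustration under assumptions, not the switches' actual code. The dual leaky bucket chains two of these, one enforcing the peak rate (PCR with CDVT tolerance) and one the sustained rate (SCR with a burst allowance derived from MBS).

```python
# A single leaky-bucket policer: cells arriving "too early" are non-conforming
# and may be discarded (or tagged) under congestion.

class LeakyBucketPolicer:
    def __init__(self, increment: float, limit: float):
        self.increment = increment      # nominal spacing between conforming cells (e.g. 1/PCR)
        self.limit = limit              # allowed tolerance (e.g. CDVT)
        self.tat = 0.0                  # theoretical arrival time of the next cell

    def conforms(self, arrival_time: float) -> bool:
        """Return True if a cell arriving at `arrival_time` conforms to the traffic contract."""
        if arrival_time < self.tat - self.limit:
            return False                # cell arrived too early: non-conforming
        # conforming: push the theoretical arrival time forward
        self.tat = max(arrival_time, self.tat) + self.increment
        return True

# Example: nominal rate of 10 cells per time unit (increment 0.1) with a small tolerance.
policer = LeakyBucketPolicer(increment=0.1, limit=0.05)
arrivals = [0.0, 0.1, 0.15, 0.18, 0.4]
print([policer.conforms(t) for t in arrivals])   # [True, True, True, False, True]
```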
Intelligent Network
The ATM transport network is an intelligent network in which each node is an autonomous element. As mentioned above, the switches that form the ATM network individually discover the topology of their surroundings through a dialogue protocol between nodes.
This kind of approach, novel in broadband networks, opens the door to a whole new range of functionality (links of different speeds, flexible topology, traffic balancing, scalability, ...) and is, without a doubt, the cornerstone of ATM technology.
Tuesday, March 6, 2012
Training Dynamics CRM 4.0 immersion courses
Training is one of the foundations of CRM adoption – if people don’t know how to use a system, and if it’s not clear how a system will benefit them, they won’t use it. The same goes for the partners who implement CRM systems.
So how do you take someone with an IT background and bring them up to speed with CRM in a hurry? One solution might be an immersive course – stick them in a CRM boot camp and drill them until they know a CRM solution backward and forward.
“Actually, I hate the term ‘boot camp,’” says David Minutella, vice president of education for Training Camp, a developer of IT training programs that’s rolling out just such an accelerated set of courses dealing with Microsoft Dynamics CRM 4.0. “We call it accelerated training – we get you in there, house you, and then have you in a classroom for 10 hours a day so you can focus on what you need to learn.”
Having attended a boot camp, I can assure you that a program like this would suffer little from the absence of tear gassing, draconian haircut policies and random yelling, although occasional forced push-ups might help focus attention, but that’s neither here nor there. The program takes students away from their home organizations and houses and feeds them, giving them access to lab facilities – and instructors – 24 hours a day. Minutella says the program’s outcome-based learning is built around a lecture-lab-review format, or as he puts it simply: “we teach it, teach it again, then teach it one more time.”
The two courses, Microsoft Dynamics CRM 4.0 Applications and Microsoft Dynamics CRM 4.0 Installations and Deployment, aim to instill in-depth knowledge of the configuration, management and customization of the application, as well as instruction on new features, installation and networking with Windows Vista. The takeaway for the companies sending pupils to the accelerated training is an in-house expert on Dynamics CRM 4.0 who’s ready to work on the solution upon arriving home after just a few days away (Applications is 10 days and Installations and Deployment is seven days, but the training doesn’t have to be back-to-back); for students, completing the course and exams means they’ll get the Microsoft Certified Business Management Solutions Specialist and Professional Application certifications.
The program launched last month, and Minutella said “there’s already been quite a bit of interest.” Training Camp also wants to launch a third track dealing with customizations to Dynamics CRM 4.0.