First Joint WOSP/SIPEW International
Conference on Performance Engineering

San Jose, California, USA
January 28-30, 2010
In cooperation with, and with the generous support of, SPEC

Call for Papers

     The goal of the 1st Joint WOSP/SIPEW International Conference on Performance Engineering is to provide a forum for ideas in performance engineering of software and systems, including performance measurement, modeling, benchmark design, and run-time performance management. WOSP/SIPEW has been established as a joint meeting of the ACM Workshop on Software and Performance (WOSP) and the SPEC International Performance Evaluation Workshop (SIPEW).

     Since its inception in 1998, WOSP has brought together software engineers, developers, performance analysts and software/performance modelers who are addressing the challenges of increasing system complexity, rapidly evolving software technologies, short time to market, incomplete documentation, and less-than-adequate methods, models and tools for developing, modeling, and measuring scalable, high-performance software. The focus of WOSP is therefore in the intersection of software and performance, rather than one discipline in isolation.

     SIPEW was established by the Standard Performance Evaluation Corporation (SPEC) with the goal to bridge the gap between theory and practice in the field of system performance evaluation by providing a forum for sharing ideas and experiences between industry and academia. The workshop brings together researchers and industry practitioners to share and present their experiences, discuss challenges, and report state-of-the-art and in-progress research in all aspects of performance evaluation.

     The conference is co-located with the SPEC Annual Meeting 2010, which will be attended by numerous representatives from across the hardware and software industry. This will provide a unique opportunity for researchers to meet with industry practitioners.

Topics of interest include (but are not limited to) the following areas:
  • Performance modeling of software
    • Languages and ontologies
    • Methods and tools
    • Composing models
    • Capturing user behavior
  • Performance measurement and analysis
    • Software performance testing
    • Application performance measurement and monitoring
    • Analysis of measured application performance data
  • Performance and other quality attributes
    • Relationship/integration/tradeoffs with other QoS attributes
    • Relationship/integration/tradeoffs with cost and schedule
  • Performance and development processes
    • Software performance patterns and anti-patterns
    • Software/performance tool interoperability (models and data interchange formats)
    • Performance-oriented design, implementation and configuration management
    • Software Performance Engineering and Model-Driven Development
    • Gathering, interpreting and exploiting software performance annotations and data
  • Performance modeling methodologies
    • Analytical, simulation and statistical modeling methodologies
    • Model validation and calibration techniques
  • Performance modeling and analysis tools
  • Model-driven performance testing, measurement, and experimental design
  • Model-driven performance requirements engineering
  • Benchmarking
    • Performance metrics and benchmark suites
    • Benchmarking methodologies
    • Development of parameterizable, flexible benchmarks
    • Use of benchmarks in industry and academia
  • Workload characterization and experimental performance evaluation
    • Workload characterization techniques
    • Application tracing and profiling
    • Performance tuning and optimization
    • Tools for performance measurement, profiling and tuning
  • Run-time performance management
    • Use of models at run-time
    • Online performance prediction
    • Autonomic resource management
    • Utility-based optimization
  • Performance evaluation in different environments and application domains, including, but not limited to:
    • Service-oriented architectures (SOA)
    • Web-based systems, e-business, Web services
    • Transaction-oriented systems
    • Virtualization platforms
    • Communication networks
    • Parallel and distributed systems
    • Embedded and autonomous systems
    • Cluster and grid computing environments
    • High performance computing
    • Event-based systems
    • Real-time and multimedia systems
    • Peer-to-peer, mobile and wireless systems
  • Power and performance, energy efficiency
     Several kinds of papers are sought: basic research, novel applications, and industrial experience, with different evaluation criteria for each category. Experience reports are particularly encouraged.


Important Dates
  • Deadline for submission of papers and tutorial proposals: July 21, 2009
  • Notification of acceptance to authors: September 3, 2009
  • Final papers and completed copyright transfer forms due at the publisher web site: October 5, 2009
  • Deadline for advance registration: December 31, 2009
  • Technical Sessions: January 28-30, 2010

  • Full research papers (up to 12 pages, in ACM format)

  • Industrial experience reports, which present unmet needs or describe the application of a performance technique in enough detail that its effectiveness can be evaluated; they may be any appropriate length up to the 12-page limit described above.

  • Short research/industrial papers (up to 6 pages in ACM format)

  • Poster session papers (up to 4 pages in ACM format)

  • Tutorial proposals (up to 2 pages)

     There will also be an hour-long work-in-progress session. Participants are invited to submit a title and a half-page abstract in advance, and will be given up to five minutes for a short presentation.